Apache NiFi
Looking across the data landscape, we've seen plenty of resting places to store your data and make those assets available to power your use cases. But one question is usually left out: how do we get data into these data platforms in the first place?
This is where tools like Apache NiFi come into play: an open-source project from the Apache Software Foundation whose main purpose, in its creators' own words, is that "it was built to automate the flow of data between systems."
Since the term "data flow" can mean several things inside our modern data platforms, for this series of articles on NiFi my premise in the next article will be: is it possible to perform ETL across a wide ecosystem using a single, unified data bus, covering the most common use cases of a small or medium-sized company?
Can NiFi offer a good answer in terms of harmonization when planning data ingestion into your data platform?
Something I want to figure out while breaking down the silos is whether NiFi's main selling points hold up: a low-code approach and nearly 280 ready-made connectors to make your platform work as expected.
The only way to have fun while learning something is the pragmatic one: giving it a try.
Need more info?
https://nifi.apache.org/docs/nifi-docs/html/overview.html#eip