Data practitioners and leaders should take an agile, incremental, and business-centered approach to modernizing their data pipelines and environment.
- Adaptability. Data engineers and other stakeholders should treat data pipelines as living organisms rather than static systems. New users, use cases, risks, and opportunities will require ongoing changes to your data environment. Such changes might include new sources, targets, platforms, and connections. Design your architecture and pipelines for adaptability by investing in open tools, APIs, and data formats.
- Incremental change. Data teams must maintain uptime and resiliency while adapting to new business and technical requirements, which calls for incremental change. Rather than ripping and replacing a platform or set of pipelines, phase in new elements in a modular fashion, one at a time, gauging progress along the way.
- Business-driven evaluation criteria. When it comes time to select a pipeline tool, be sure to have business objectives shape your evaluation criteria. Advanced, flashy tools might impress engineers, but they can drive up compute costs or training requirements for business users.
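The adaptability and incremental-change principles above can be sketched in code: if the pipeline core depends only on small source and sink interfaces, a new source, target, or platform can be phased in one at a time without touching the rest. This is an illustrative sketch, not any particular tool's API; the class and function names (`Source`, `Sink`, `CsvSource`, `ListSink`, `run_pipeline`) are hypothetical.

```python
import csv
import io
from typing import Iterable, Protocol


class Source(Protocol):
    """Anything that can yield records (file, database, API, stream...)."""
    def read(self) -> Iterable[dict]: ...


class Sink(Protocol):
    """Anything that can accept records (warehouse, lake, queue...)."""
    def write(self, records: Iterable[dict]) -> None: ...


class CsvSource:
    """Example source: reads records from a CSV text stream."""
    def __init__(self, stream: io.TextIOBase):
        self.stream = stream

    def read(self) -> Iterable[dict]:
        return csv.DictReader(self.stream)


class ListSink:
    """Example target: collects records in memory, standing in for a real store."""
    def __init__(self) -> None:
        self.records: list[dict] = []

    def write(self, records: Iterable[dict]) -> None:
        self.records.extend(records)


def run_pipeline(source: Source, sink: Sink) -> None:
    # The pipeline depends only on the two interfaces, so sources and
    # targets can be swapped incrementally, one module at a time.
    sink.write(source.read())
```

A usage example: `run_pipeline(CsvSource(io.StringIO("id,name\n1,a\n")), ListSink())` moves records from the CSV stream into the in-memory sink. Replacing `CsvSource` with, say, a warehouse-backed source requires no change to `run_pipeline` itself, which is the modular, open-interface design the bullets above describe.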
Best Practices for Modern Data Pipelines