1. Description
- Proven experience designing and managing data flow architectures in complex environments.
- Expertise in data transformation processes, including ETL/ELT pipelines and real-time processing (a minimal pipeline sketch follows this list).
- Strong knowledge of data integration tools such as Informatica, Talend, Apache NiFi, or equivalent.
- Familiarity with data streaming platforms (e.g., Kafka, Spark Streaming) and batch processing systems.
- Proficiency in SQL and in programming languages such as Python, Scala, or Java for data manipulation and automation.
- Deep understanding of data governance, including data quality, lineage, and privacy regulations (e.g., GDPR).
- Experience with cloud platforms (AWS, Azure, or GCP) and data services (e.g., Snowflake, BigQuery, Databricks).
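For illustration only, here is a minimal sketch of the batch ETL work described above, written in Python (one of the languages named in this posting) against SQLite via the standard library. The source file, table name, and field names are hypothetical, not part of any specific engagement.

```python
import csv
import sqlite3


def extract(path):
    """Extract: stream raw rows from a CSV export (file name is hypothetical)."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)


def transform(rows):
    """Transform: normalize fields and apply a basic data-quality gate."""
    for row in rows:
        if not row.get("customer_id"):  # skip incomplete records
            continue
        yield {
            "customer_id": row["customer_id"].strip(),
            "email": row.get("email", "").strip().lower(),
        }


def load(rows, conn):
    """Load: upsert transformed rows into the target table with plain SQL."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers ("
        "customer_id TEXT PRIMARY KEY, email TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO customers (customer_id, email) "
        "VALUES (:customer_id, :email)",
        rows,
    )


if __name__ == "__main__":
    with sqlite3.connect("warehouse.db") as conn:  # commits on success
        load(transform(extract("customers_raw.csv")), conn)
```

The generator-based stages keep memory use flat on large exports, which is the same reason production ETL/ELT tools stream records rather than materialize every intermediate result.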
2. Must-Have Certifications
- Relevant certifications in data integration or data architecture (e.g., Informatica, AWS Data Analytics, Azure Data Engineer).
- Certifications in cloud platforms (AWS, Azure, GCP) are preferred.
3. What Is the Consultant Going to Be Doing on a Day-to-Day Basis?
- Design, document, and manage data flow and transformation architectures for enterprise data systems.
- Create and optimize ETL/ELT pipelines and support real-time data processing (see the streaming sketch after this list).
- Define and document data flows and lineage, visualizing "as-is" and "to-be" states (a lineage-record sketch also follows the list).
- Work with data modeling, metadata management, and data governance teams to ensure accuracy, quality, and compliance with regulations.
- Collaborate with data quality/governance teams to identify and implement enhancements to data architecture.
- Liaise with business and technology stakeholders to understand data requirements and resolve issues throughout the software development life cycle (SDLC).
- Evaluate and recommend tools and technologies for data integration, quality, and transformation.
- Stay updated on emerging data technologies and propose innovative solutions for scalable, secure, and efficient data architecture.
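As a hedged sketch of the real-time side of the role, the snippet below consumes JSON events with the kafka-python client and applies the same quality gate as the batch example above. The topic name, broker address, and consumer group are assumptions for illustration.

```python
import json

from kafka import KafkaConsumer  # kafka-python client

# Topic, broker, and group id are hypothetical values for this sketch.
consumer = KafkaConsumer(
    "customer-events",
    bootstrap_servers="localhost:9092",
    group_id="data-architecture-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Reuse the batch pipeline's quality rule so streaming and batch
    # paths enforce consistent record-level checks.
    if not event.get("customer_id"):
        continue
    print(event["customer_id"], event.get("email", "").strip().lower())
```

Keeping transformation rules identical across the batch and streaming paths is one practical way the role's "optimize pipelines" and "real-time processing" duties intersect.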
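Finally, a minimal sketch of how "as-is" versus "to-be" lineage for a single field might be recorded and compared. The record structure, sources, and transformation names are hypothetical; real engagements would typically use a metadata or lineage tool rather than ad hoc structures.

```python
# Hypothetical lineage record pairing the current ("as-is") flow for one
# field with the proposed ("to-be") flow, so the two states can be diffed.
email_lineage = {
    "field": "warehouse.customers.email",
    "as_is": {
        "source": "crm_export.csv:email",
        "transformations": ["trim", "lowercase"],
    },
    "to_be": {
        "source": "crm_api:/customers#email",
        "transformations": ["trim", "lowercase", "validate_format"],
    },
}

# Flag what changes between the two states, e.g. for an architecture review.
added = set(email_lineage["to_be"]["transformations"]) - set(
    email_lineage["as_is"]["transformations"]
)
print(f"New transformations in the to-be flow: {sorted(added)}")
```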