As a Data Engineer, you will play a vital role in designing and developing data pipelines that ingest, validate, and export threat intelligence data.
Key Responsibilities:
* Develop and maintain robust data pipelines for ingesting threat intelligence data from various sources into our data ecosystem.
* Implement data validation processes to ensure accuracy, completeness, and consistency of the data.
* Collaborate with threat analysts to design appropriate solutions for their data requirements.
* Automate data export processes to external systems and partners through efficient, repeatable workflows.
* Optimize existing data pipelines for improved performance and scalability, ensuring seamless data availability and integrity.
* Troubleshoot issues in data pipelines, providing swift resolution to ensure continuous data flow.
* Document technical specifications, data flows, and procedures for maintaining and supporting data pipelines.
* Stay updated on emerging technologies and best practices in data engineering, incorporating them into our data ecosystem.
* Provide guidance and support to team members on data engineering best practices and methodologies.
Requirements:
* Proven experience as a Data Engineer or in a similar role, with a focus on data ingestion, validation, and export automation.
* Strong proficiency in the Python programming language.
* Experience with data pipeline orchestration tools such as Apache Airflow, Apache NiFi, or similar.
* Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform, and with cloud data warehouses such as Snowflake.
* Knowledge of data validation techniques and tools for ensuring data quality.
* Experience building container images with Docker and deploying them with orchestration tools such as Kubernetes.
* Excellent problem-solving skills and attention to detail.
* Strong communication and collaboration skills, enabling effective teamwork.
Skills and Qualifications:
* Proficiency in the Python programming language is essential.
* Experience with data pipeline orchestration tools is highly desirable.
* Familiarity with at least one major cloud platform is required.
* Knowledge of data validation techniques and tools is required.
* Experience with containerization technologies is an advantage.
Benefits:
* Opportunity to work on cutting-edge data engineering projects.
* Collaborative and dynamic work environment.
* Professional growth and development opportunities.
* Recognition and rewards for outstanding performance.