
IT Engineer, Data Engineering – Tech Lead | Continental, Bengaluru



Job Description

• Govern data acquisition and ingestion architecture for enterprise-wide data & analytics integration.

• Establish reusable, scalable, and governed interfaces to all source systems across business domains.

• Define standards and ensure engineering quality across ingestion pipelines and upstream data flows.

• Act as escalation and design authority for ingestion, unstructured data, and advanced analytics integration points.

• Oversee ingestion and integration design across 100+ data sources.

• Support 25+ data engineering and data science professionals with ingestion architecture and troubleshooting.

• Interface with source system experts, data scientists, platform architects, and functional IT stakeholders.

Main Tasks

• Define ingestion frameworks for batch, streaming, and APIs.

• Establish standards for JSON, CSV, XML, and unstructured data.

• Govern manual uploads and exception workflows.

• Collaborate with system owners to define interface specifications.

• Manage schema evolution and API reliability.

• Align ingestion architecture with platform and business requirements.

• Collaborate on ingestion for data science and ML pipelines.

• Ensure robust API and event-driven integration for AI/LLM use cases.

• Support complex acquisition scenarios with scalability requirements.

• Review high-risk ingestion pipelines for performance and security.

• Support data engineers on error handling, retries, and load orchestration.

• Ensure ingestion aligns with governance and architecture.

• Maintain ingestion libraries and templates.

• Promote reuse and automation across engineering teams.

• Lead best practices in TDD, code reviews, and pipeline testing.

• Maintain architectural blueprints, templates, and best practices.

• Publish design guidelines and coding standards.

• Create reusable architecture patterns for lakehouse environments.

• Monitor usage and implement auto-scaling policies.

• Analyze and optimize cluster configurations for cost-efficiency.

• Provide cost transparency and usage reporting to stakeholders.

Qualifications

Degree in Computer Science or related field; certifications in Databricks or Microsoft Azure preferred.

6–10 years in data engineering with strong focus on enterprise-scale data & analytics integration, including ML and LLM scenarios.

Experience building robust ingestion frameworks and reusable components.

Track record of architectural ownership and peer enablement with diverse teams.

Experience working in international teams across multiple time zones and cultures, preferably with teams in India, Germany, and the Philippines.

Additional Information

The well-being of our employees is important to us.

That's why we offer exciting career prospects and support you in achieving a good work-life balance with additional benefits such as:

  • Training opportunities
  • Mobile and flexible working models
  • Sabbaticals
  • and much more...

Sound interesting to you?

Diversity and inclusion are important to us and make our company strong and successful.

We offer equal opportunities to everyone, regardless of age, gender, nationality, cultural background, disability, religion, ideology, or sexual orientation.




