
Software Engineer II - Databricks/Spark + Python, AWS
Location: Bengaluru



Job description

Advance your software engineering career as a Software Engineer II.

Be part of a team that is pushing the boundaries of what's possible.

As a Software Engineer II at JPMorgan Chase within the Commercial and Investment Bank's Global Banking Team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way.

As a member of a software engineering team, you execute software solutions through the design, development, and troubleshooting of a single technical area within a business function, while gaining the skills and experience to grow within your role.

Job responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Work with large datasets using Spark on Databricks or AWS EMR.
  • Write efficient SQL queries for data extraction, transformation, and analysis.
  • Collaborate with data scientists, analysts, and other engineering teams to deliver high-quality data solutions.
  • Implement data processing workflows on AWS services such as S3, ECS, Lambda, EMR, and Glue.
  • Develop and maintain Python scripts for data processing and automation.
  • Ensure data quality, integrity, and security across all data engineering activities.
  • Troubleshoot and resolve data-related issues in a timely manner.
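
To give candidates a feel for the extract-transform-load pattern named in the responsibilities above, here is a minimal, self-contained Python sketch. It uses only the standard library (csv and sqlite3) so it runs anywhere; in the actual role, the same pattern would run at scale with Spark on Databricks or AWS EMR, reading from S3 rather than an inline string. All names and the sample data are illustrative, not taken from the job description.

```python
import csv
import io
import sqlite3

# Illustrative sample data; a real pipeline would read from S3 or a database.
RAW_CSV = """order_id,amount,region
1,120.50,APAC
2,,EMEA
3,89.99,APAC
"""

def extract(text):
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and cast fields to typed values
    (a tiny stand-in for the data-quality checks the role calls for)."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["region"])
        for r in rows
        if r["amount"]
    ]

def load(rows):
    """Load: write cleaned rows into a SQL table, then aggregate with a SQL query."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return dict(con.execute("SELECT region, SUM(amount) FROM orders GROUP BY region"))

totals = load(transform(extract(RAW_CSV)))
```

On Spark, `extract`/`transform`/`load` would become DataFrame reads, column expressions, and writes, but the shape of the pipeline is the same.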

Required qualifications, capabilities, and skills

  • Formal training or certification in software engineering concepts and 2+ years of applied experience.
  • Minimum of 3 years of overall experience in IT.
  • Proven expertise in Data Engineering with Spark.
  • Hands-on experience with Databricks or AWS EMR.
  • Strong knowledge of SQL and database concepts.
  • Experience in ETL and data processing workflows.
  • Proficiency in AWS services: S3, ECS, Lambda, EMR/Glue.
  • Advanced skills in Python programming.
  • Excellent problem-solving and analytical abilities.

Preferred qualifications, capabilities, and skills

  • Experience with Infrastructure as Code (IaC) using Terraform or CloudFormation.
  • Familiarity with writing unit test cases for Python code.
  • Knowledge of version control systems such as Bitbucket or GitHub.
  • Understanding of CI/CD pipelines and automation tools.

Required skill profession: Computer Occupations


