Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best.

Optum is seeking an innovative Software Data Engineer to join our distinguished team.

Job Designation : Software Data Engineer

Salary : 9 LPA – 10.5 LPA

Qualification : Bachelor’s Degree

Experience : Freshers / Experienced

Skill Set :

  1. Proficiency in SQL and experience with data modeling, ETL processes, and data warehousing solutions
  2. Familiarity with Big Data and cloud tools such as Spark, Scala, Python, PySpark, GitHub Actions, Maven, Parquet, Kafka, and Avro
  3. Extensive experience with Snowflake, Azure Data Factory, and Azure Databricks
  4. Demonstrated aptitude for gathering business requirements by asking the right questions and capturing the understanding in detailed flow diagrams
  5. Familiarity with Streamlit, Docker, and GenAI APIs
  6. Core understanding of distributed systems fundamentals
  7. Effective communication skills, both written and verbal.
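The SQL and ETL proficiency listed above can be pictured with a minimal sketch — a tiny extract-transform-load pass using Python's built-in sqlite3. The table names, columns, and aggregation here are illustrative assumptions, not details of the role:

```python
import sqlite3

# In-memory database standing in for both source and warehouse
# (an assumption for illustration; real pipelines span separate systems).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a hypothetical raw claims table with string-typed amounts.
cur.execute("CREATE TABLE raw_claims (member_id TEXT, amount TEXT)")
cur.executemany("INSERT INTO raw_claims VALUES (?, ?)",
                [("M1", "120.50"), ("M2", "80.00"), ("M1", "40.25")])

# Transform: cast amounts to numeric and aggregate per member.
# Load: materialize the result into a warehouse-style summary table.
cur.execute("""
    CREATE TABLE member_spend AS
    SELECT member_id, SUM(CAST(amount AS REAL)) AS total_spend
    FROM raw_claims
    GROUP BY member_id
""")

rows = list(cur.execute(
    "SELECT member_id, total_spend FROM member_spend ORDER BY member_id"))
print(rows)  # [('M1', 160.75), ('M2', 80.0)]
conn.close()
```

In a production setting the same shape of work would typically run on Spark or Azure Data Factory rather than sqlite3; the point is the extract → transform → load structure.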

Job Description :

The primary responsibilities include designing, developing, implementing, testing, deploying, monitoring, and maintaining systematic data delivery methods. This position encompasses all key development and maintenance activities across various technology functions to ensure the delivery of high-quality data for users, applications, and services. The role demands designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.

  1. Collaborate with lead data engineers, data scientists and architects to understand data requirements and design optimal data solutions
  2. Design, develop, implement, and manage cross-domain, modular, optimized, flexible, scalable, secure, reliable, and high-quality data solutions for meaningful analyses and analytics, ensuring operability
  3. Enhance data efficiency, reliability, and quality by developing performant data solutions
  4. Integrate instrumentation in the development process to monitor data pipelines, using measurements to detect internal issues before they cause user-visible outages or data quality problems
  5. Develop processes and diagnostic tools to troubleshoot, maintain, and optimize solutions, responding to customer and production issues
  6. Reduce technical debt and transform technology through the adoption of open-source solutions, cloud integration, and HCP assessments
  7. Maintain comprehensive documentation, clearly recording process flows, data flows, code flows, and other technical aspects of processes deployed in production
  8. Monitor and analyze system performance metrics, identifying areas for improvement and implementing solutions
  9. Interpret policies and leverage experience to solve issues
  10. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues
  11. Explore ways to enhance data quality and reliability
  12. Lead the development of new concepts, technologies, and products to meet emerging needs
  13. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).

Location : Bengaluru, Karnataka, India