Featured

IS Data Engineer

CareOregon via Teal
Remote · US · Senior · CLT · 8 days ago

Estimated Salary

R$ 12.870,00 - R$ 19.305,00

Job Description

The Position

The IS Data Engineer plays a pivotal role in operationalizing and advancing data and analytics for CareOregon’s business initiatives.


This involves building, managing and optimizing data pipelines and moving them effectively into production for data and analytics consumers.


Consumers include business data analysts, data scientists and other roles that need curated data for data and analytics use cases.


The IS Data Engineer ensures compliance with data governance and data security requirements while enabling faster data access, integrated data reuse and acceleration of time-to-solution for CareOregon’s data and analytics initiatives.


Estimated Hiring Range: $124,200.00 - $151,800.00
Bonus Target: SIP Target, 5% Annual
Current CareOregon Employees: Please use the internal Workday site to submit an application for this job.


Responsibilities

  • Create, maintain and optimize data pipelines as workloads move from development to production for specific use cases
  • Manage data pipelines through all stages, beginning with ingestion of data sources through integration to consumption for specific use cases
  • Utilize innovative tools, techniques and architectures to partially or completely automate tasks in order to minimize manual processes, reduce the potential for error and improve productivity
  • Assist with the renovation of data management infrastructure that supports automation in data integration and management
  • Partner with other Information Systems teams, business data analysts and other data and analytics consumers to refine their data requirements for initiatives and consumption
  • Train data and analytics consumers on data pipelines and preparation techniques to make it easier for them to integrate and consume the data they need for their own use cases
  • Apply understanding of data and domains to address emerging data requirements
  • Propose innovative data ingestion, preparation, integration and operationalization techniques to optimally address data requirements
  • Promote CareOregon’s available data and analytics capabilities and expertise to IS staff and department leaders
  • Collaborate with and educate staff and leadership on how to leverage data and analytics capabilities to achieve business goals

Requirements

  • Minimum 5 years’ experience in data management and RDBMS, in roles that included all or most of the following functions:
  • Database design and development experience
  • ETL experience
  • Development utilizing tools such as Microsoft SQL Server, Snowflake and/or similar tools
  • Data warehouse technical development that encompasses the data management life cycle and establishes end-to-end data warehousing, data management and analytics architecture
  • Experience with multi-source data in different formats and/or data structures
  • In-depth knowledge of commonly used database programming languages for relational databases (e.g. SQL)
  • In-depth knowledge of commonly used cloud-based data warehouse platforms (e.g. Snowflake, Redshift)
  • Understanding of business intelligence solutions, including working knowledge of commonly used data discovery, analytics and BI software tools for semantic layer-based data discovery (e.g. Tableau, Power BI)
  • Knowledge of emerging data ingestion and integration technologies
  • Strong ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata and workload management
  • Strong ability to work with IT and business staff to integrate analytics and data science output into business processes and workflows
  • Strong ability to partner with data science teams to refine and optimize machine learning models and algorithms
  • Strong ability to collaborate with data governance, quality and security experts to move data pipelines into production in compliance with applicable standards and certifications
  • Ability to work across multiple deployment environments, including cloud, on-premises and hybrid
  • Ability to work with multiple operating systems and containerization platforms (e.g. Docker, Kubernetes, AWS Elastic Container Service)
  • Ability to develop using Microsoft Azure products (e.g. Data Factory, Functions, Databricks, Monitor)
  • Ability to work with large, heterogeneous datasets to build and optimize data pipelines, pipeline architectures and integrated datasets
  • Ability to extract business value while considering automation opportunities
  • Adept in the use of traditional data integration technologies, including ETL/ELT, data replication/CDC, and API design and access
  • Strong ability to work with and optimize existing ETL/ELT processes, data integration and data preparation flows, and to help move them into production
  • Strong ability to work with analytics tools for object-oriented/object function scripting using R, Python, Java, Scala and/or similar languages
  • Strong ability to apply Agile methodologies
  • Ability to apply DevOps practices and tools and DataOps principles to data pipelines to improve data flows
  • Curiosity and desire for ongoing learning about new data initiatives and how to address them
  • Ability to continually learn the latest versions of development tools and software products
  • Excellent written and oral communication skills
  • Ability to successfully manage multiple tasks, concurrent high-priority projects and continuous deadlines
  • A high degree of initiative, motivation, self-discipline and good judgment
  • Ability to work effectively with diverse individuals and groups
  • Ability to learn, focus, understand, and evaluate information and determine appropriate actions
  • Ability to accept direction and feedback, as well as tolerate and manage stress
  • Ability to see, read, and perform repetitive finger and wrist movement for at least 6 hours/day
  • Ability to hear and speak clearly for at least 3-6 hours/day

Nice-to-haves

  • 10 years of experience (excluding internship and educational experience)
  • Data management experience within the healthcare industry preferred
  • Utilization of data integration, modeling, optimization and data quality improvement processes
  • Development using Microsoft Azure products such as Data Factory, Functions, Databricks, Monitor and/or similar products
  • Knowledge of the basic concepts of managed care preferred
  • Knowledge of health insurance business entities, relationships and processes preferred

Benefits

  • We offer a strong Total Rewards Program. This includes competitive pay, bonus opportunity, and a comprehensive benefits package.
  • Eligibility for bonuses and benefits is dependent on factors such as the position type and the number of scheduled weekly hours.
  • Benefits-eligible employees qualify for benefits beginning on the first of the month on or after their start date.
  • CareOregon offers medical, dental, vision, life, AD&D, and disability insurance, as well as health savings account, flexible spending account(s), lifestyle spending account, employee assistance program, wellness program, discounts, and multiple supplemental benefits (e.g., voluntary life, critical illness, accident, hospital indemnity, identity theft protection, pre-tax parking, pet insurance, 529 College Savings, etc.).
  • We also offer a strong retirement plan with employer contributions.
  • Benefits-eligible employees accrue PTO and Paid State Sick Time based on hours worked/scheduled hours and the primary work state. Employees may also receive paid holidays, volunteer time, jury duty, bereavement leave, and more, depending on eligibility.
  • Non-benefits-eligible employees can enjoy 401(k) contributions, Paid State Sick Time, wellness and employee assistance program benefits, and other perks. Please contact your recruiter for more information.


