
Cyber Security Engineering Specialist - Data Engineer L4 Franklin

CHS Corporate

CHS Corporate, Franklin, TN, United States


Community Health Systems is one of the nation’s leading healthcare providers. Developing and operating healthcare delivery systems in 44 distinct markets across 15 states, CHS is committed to helping people get well and live healthier. CHS operates 78 acute-care hospitals and more than 1,000 other sites of care, including physician practices, urgent care centers, freestanding emergency departments, occupational medicine clinics, imaging centers, cancer centers and ambulatory surgery centers.

Summary:

Are you looking to solve the most interesting problems at the crossroads of Data Engineering and Cyber Security?

As a Cyber Security Engineering Specialist, Data Engineer L4 for the Cyber Security Risk Management organization, you’ll be responsible for acquiring, curating, and publishing data for analytical or operational uses. You will prepare data for use by data scientists, business users, and technology platforms by creating a single version of the truth for all data consumers. You will work with streaming and batch-loading data sources from cybersecurity solutions. Successful data engineers have the skills to design, build, and maintain reliable data pipelines and ETL processes to feed databases and data warehouses using a variety of tools and techniques. You will have the opportunity to work with various programming languages, technologies, and both structured and unstructured data.
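To make the acquire-curate-publish workflow above concrete, here is a minimal, hypothetical batch ETL sketch in Python. All table and field names are invented, and the stdlib `sqlite3` module stands in for a real warehouse such as BigQuery; a production pipeline would use a managed service like Google Cloud Data Fusion instead.

```python
import sqlite3

# Invented raw security events, as they might arrive from a source system.
RAW_EVENTS = [
    {"host": "web-01", "severity": "HIGH", "count": "3"},
    {"host": "web-02", "severity": "low ", "count": "1"},
    {"host": None,     "severity": "High", "count": "2"},  # missing host
]

def transform(rows):
    """Cleanse and standardize: impute missing hosts, normalize severity,
    and enforce types -- a 'single version of the truth' for consumers."""
    curated = []
    for row in rows:
        curated.append({
            "host": row["host"] or "unknown",             # simple imputation
            "severity": row["severity"].strip().lower(),  # common data meaning
            "count": int(row["count"]),                   # enforce type
        })
    return curated

def load(conn, rows):
    """Publish curated rows into a warehouse table (sqlite3 as stand-in)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (host TEXT, severity TEXT, count INTEGER)"
    )
    conn.executemany("INSERT INTO events VALUES (:host, :severity, :count)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(RAW_EVENTS))

# Data-quality check: the curated table should contain no NULL hosts.
bad = conn.execute("SELECT COUNT(*) FROM events WHERE host IS NULL").fetchone()[0]
print(bad)  # prints 0
```

The same extract-transform-load shape applies whether the source is a batch file drop or a streaming feed; only the ingestion side changes.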

A qualified candidate:
• Is a lifelong learner who is passionate about technology
• Derives joy from tackling complex problems and working through solution tradeoffs
• Is able to learn on the fly and fill knowledge gaps on demand
• Is able to work with a variety of people at various levels
• Has excellent, process-oriented data management and QA skills
• Is able to debug problems to their root cause, especially when the path leads through multiple systems or environments
• Has an interest in working with data at the protocol level
• Has an aptitude for data presentation and the ability to transform raw data into meaningful, actionable reports
• Has significant experience creating data pipelines and ETL processes
• Is experienced with Google Cloud Data Fusion or similar data integration services
• Is experienced with BigQuery or other data warehouse products
• Has excellent communication ability

Essential Duties and Responsibilities:
• Consults on complex data product projects by analyzing moderate to complex end-to-end data product requirements and existing business processes to lead the design, development, and implementation of data products.
• Builds data cleansing, imputation, and common data meaning and standardization routines from source systems by understanding business and source system data practices and by using data profiling and source data change monitoring, extraction, ingestion, and curation of data flows.
• Produces data views and data flows for varying client demands, such as dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research & exploration.
• Translates business data stories into a technical story breakdown structure and work estimate so that value and fit can be assessed for a schedule or sprint.
• Creates business user access methods to structured and unstructured data by such techniques as mapping data to a common data model, transforming data as necessary to satisfy business rules, and validating data content.
• Collaborates with enterprise teams and other internal organizations on CI/CD best practices, drawing on experience with tools such as Google Tables, JIRA, Jenkins, and Confluence.
• Implements production processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
• Develops and maintains scalable data pipelines for both streaming and batch requirements, and builds out new API integrations to support continuing increases in data volume and complexity.
• Writes and performs data unit/integration tests for data quality with input from a business requirements/story, creates and executes testing data and scripts to validate that quality and completeness criteria are satisfied. Can create automated testing programs and data that are reusable for future code changes.
• Practices code management and integration using Git repositories, following the engineering organization’s principles and practices.
• Participates as an expert and learner in team tasks for data analysis, architecture, application design, coding, and testing practices.
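The testing duty above, writing reusable data-quality checks from a business story's completeness criteria, might look like the following sketch. This is illustrative only: the table, column names, and criteria are invented, and `sqlite3` again stands in for the warehouse.

```python
import sqlite3

def check_completeness(conn, table, required_cols):
    """Reusable data-quality test: report how many NULLs each required
    column contains. An empty result means the criteria are satisfied."""
    problems = {}
    for col in required_cols:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()[0]
        if nulls:
            problems[col] = nulls
    return problems

# Invented test data: one row violates the completeness rule.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alerts (id INTEGER, source TEXT, severity TEXT)")
conn.executemany("INSERT INTO alerts VALUES (?, ?, ?)", [
    (1, "ids", "high"),
    (2, "edr", None),  # missing severity
])

print(check_completeness(conn, "alerts", ["id", "source", "severity"]))
# prints {'severity': 1}
```

Because the check is parameterized by table and columns, the same function can be reused unchanged when future code changes add tables or criteria.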

Qualifications:

Education:
• Undergraduate studies in computer science, management information systems, business, statistics, math, or a related field required.
• Graduate studies in business, statistics, math, computer science or a related field are preferred.

Required Experience:
• Five to eight years of relevant experience with data quality rules, data management organization/standards and practices.
• Three to five years’ experience in data warehousing, statistical analysis, data models, and queries.
• Experience with Cloud technology and infrastructure including security and access management.
• Data application and practice knowledge
• Strong problem-solving, oral, and written communication skills.
• Ability to influence, build relationships, negotiate and present to senior leaders.
• Experience manipulating, processing and extracting value from large disconnected datasets
• Advanced query authoring (SQL)
• Advanced working knowledge of a variety of databases
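"Advanced query authoring (SQL)" typically implies familiarity with constructs like window functions. The sketch below shows one such query, run through Python's stdlib `sqlite3`; the `logins` table and its data are invented for illustration, and the same SQL would run on BigQuery or most modern warehouses.

```python
import sqlite3

# Build a small, invented table of daily failed-login counts per user.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logins (user TEXT, day TEXT, failures INTEGER)")
conn.executemany("INSERT INTO logins VALUES (?, ?, ?)", [
    ("alice", "2023-01-01", 2),
    ("alice", "2023-01-02", 7),
    ("bob",   "2023-01-01", 1),
    ("bob",   "2023-01-02", 1),
])

# Window function: running total of failures per user, ordered by day --
# a common shape for security trend reporting.
query = """
SELECT user, day, failures,
       SUM(failures) OVER (PARTITION BY user ORDER BY day) AS running_failures
FROM logins
ORDER BY user, day
"""
for row in conn.execute(query):
    print(row)
```

The `PARTITION BY` clause restarts the running sum for each user, so a spike like alice's 7 failures stands out against her own baseline rather than the global one.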

Preferred Experience:
• Healthcare/Insurance/financial services industry knowledge
• Cyber Security experience
• Experience with developing compelling stories and distinctive visualizations
• Experience with AI / Machine Learning a plus

Computer Skills Required:
• Advanced skills with modern programming and scripting languages (e.g., SQL, R, Python, Spark, UNIX Shell scripting, Perl, or Ruby).
• Desired experience in: Looker / Google Data Studio, BigQuery, Google Cloud Data Fusion

Physical Demands:

To successfully perform this job, with or without a reasonable accommodation, the employee must meet the physical demands outlined below:
• The Employee is required to read, review, prepare and analyze written data and figures, using a PC or similar, and should possess visual acuity.
• The Employee may be required to occasionally climb, push, stand, walk, reach, grasp, kneel, stoop, and/or perform repetitive motions.
• The Employee is not substantially exposed to adverse environmental conditions and, therefore, job functions are typically performed under conditions such as those found within a general office or administrative work environment.