#SENIO41500

Senior Application Systems Analyst

2021-04-19
  • Location: FRANKLIN, TN (CHS Corporate)
  • Department: Architecture and Strategy
  • Field: Information Systems
  • Full Time

Job Description

Community Health Systems is a leading operator of general acute care hospitals and outpatient care centers in communities across the United States. CHS affiliates own, lease or operate 83 affiliated hospitals in 16 states with an aggregate of approximately 13,000 licensed beds. Healthcare services are also provided in more than 1,000 outpatient sites of care including affiliated physician practices, urgent care centers, freestanding emergency departments, imaging centers, cancer centers, and ambulatory surgery centers.

CHSPSC, LLC seeks a Senior Application Systems Analyst to join the Architecture and Strategy team at its Franklin, TN headquarters.

Summary:

The Senior Application Systems Analyst is responsible for hands-on design, development, and maintenance of highly scalable data acquisition, shaping, and delivery solutions in the Hadoop ecosystem. In addition, they gather and analyze business requirements and transform them into innovative DW/BI solutions that effectively facilitate and enhance the development of the company's enterprise integrated data capabilities in support of business objectives. A strong understanding of best practices and standards for large-scale distributed application design and implementation is essential.

Job Responsibilities

  • Build reliable and scalable data ingestion and transformation programs to process data in real time (primary responsibility)
  • Load large volumes of data from several disparate datasets
  • Define and manage Hadoop job flows and schedules
  • Review and manage log files
  • Assess the quality of datasets in the Hadoop data lake
  • Fine-tune Hadoop applications for high performance and throughput, using HDFS file formats and structures such as Parquet and Avro to speed up processing
  • Troubleshoot and debug run-time issues across the Hadoop ecosystem

Required Skillset

  • Experience working in the Hadoop ecosystem and its components
  • Experience developing Spark, Scala, and MapReduce programs
  • Experience with scripting languages like Python or Perl, and regular expressions
  • Experience working in Linux/UNIX OS
  • Good knowledge of concurrency and multi-threading, and big/fast data concepts
  • Working knowledge of SQL, database structures, principles, and theories

Preferred Skillset

  • 1-2 years of experience in Big Data
  • Experience in the healthcare domain
  • Experience working in distributed, highly-scalable processing environments
  • Source control, build, and continuous deployment systems – Maven, Ant, Git, Jenkins, etc.
  • Configuration/orchestration – ZooKeeper, Puppet, Salt, Ansible, Chef
  • CCDH (Cloudera Certified Developer for Apache Hadoop)