Senior Big Data Integration Developer

CareFirst BlueCross BlueShield  •  Baltimore, MD

Industry: Finance & Insurance  •  Experience: Less than 5 years

Posted 43 days ago

PURPOSE:

Independently designs and builds Big Data solutions leveraging the Cloudera Big Data technology stack.

Working under the supervision of the Big Data Manager, performs development activities, technical documentation, system performance support, and internal customer support.


PRINCIPAL ACCOUNTABILITIES: Under the direction of the Manager, DW, the incumbent's responsibilities include, but are not limited to, the following:


Duties and Responsibilities

System Design and Implementation: Designs and builds Big Data solutions to meet business requirements. Assumes complete ownership of delivery, from a data engineering standpoint, on assigned project(s).


Research and Development: Learns and implements new technologies to benefit the business.


QUALIFICATION REQUIREMENTS

  • Proven experience building and optimizing data pipelines, architectures, and datasets from both structured and unstructured data sources
  • Demonstrated ability to build processes that support data transformation, data structures, metadata, accuracy checking, and workload management
  • Strong communication skills, a self-starter mentality, and the ability to think outside the box
  • Cloudera Certified Hadoop and Spark Developer (CCA175)


Required Education and Experience:

Degree or equivalent experience: BA/BS in Computer Science, Information Systems, Information Technology, or a related field, with 3+ years of prior experience in software development, data engineering, and business intelligence; OR equivalent experience.

  • 3+ years of strong programming experience with Java, Python, or Scala
  • 3+ years of Big Data development experience in Hadoop, Yarn, HDFS, MapReduce, Hive, Sqoop, Oozie and other related Big Data technologies
  • 3+ years of experience building Real Time streaming systems, using solutions such as Flume, Kafka and Apache Spark
  • Experience tuning Hadoop/Spark parameters for optimal performance
  • Working knowledge of at least one NoSQL store (HBase, Cassandra, MongoDB, etc.)
  • Experience with Big Data querying tools, including Impala
  • Strong working knowledge of SQL and at least one major RDBMS (Oracle, DB2, etc.)
  • Advanced experience with shell scripting


Required Skills and Abilities:

Must be able to work effectively in a fast-paced environment with frequently changing priorities, deadlines, and workloads that can be variable for long periods of time. Must be able to meet established deadlines and handle multiple customer service demands from internal and external customers within set expectations for service excellence. Must be able to communicate effectively and provide positive customer service to every internal and external customer, including customers who may be demanding or otherwise challenging.


Preferred:

  • Healthcare experience
  • ETL solution experience, preferably on Hadoop
  • Web service/API integration experience, providing Data as a Service