Consultant, Data Engineering

Mastercard

O'Fallon, MO


Who is Mastercard?

We are the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives.  We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Job Title

Consultant, Data Engineering

Have you ever wanted to be a part of something BIG? Do you have experience integrating new and emerging technologies into existing environments as technologies evolve and demands change? Are you eager to work with open source software and be exposed to the fast-paced world of Big Data technology?

In this role, you can make an immediate impact for a leading global technology company, Mastercard. The Consultant, Data Engineering role is responsible for working with teams across the company to implement new data solutions while maintaining the stability of the platform. You will be responsible for assessing technologies and approaches for ingestion, transformation and storage. In addition, you’ll work within the Fraud Data Engineering team to grow its knowledge and expertise. You will get the chance to work with extremely large data sets and be on the cutting edge of transforming the way Mastercard captures, processes, stores and visualizes transactional data.

• Develop scalable streaming solutions based on Spark, Kafka and/or Flume
• Implement data models using NoSQL databases like HBase and Hive
• Design, develop and implement data models with quality and integrity top of mind to support our products
• Create documentation to support knowledge sharing, including flowcharts and diagrams
• Develop software utilizing open source technologies to interface distributed and relational data solutions
• Work to establish a Hadoop Cluster architecture
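As one illustration of the data-modeling work above, a common pattern when storing time-series events in a wide-column store such as HBase is to salt the row key and reverse the timestamp, so that monotonically increasing keys do not hotspot a single region and scans return newest events first. The sketch below uses only the Python standard library; the key layout, salt bucket count, and field names are illustrative assumptions, not an actual Mastercard schema:

```python
import hashlib

SALT_BUCKETS = 16     # illustrative number of key prefixes, not a real config value
MAX_MILLIS = 10**13   # upper bound used to reverse millisecond timestamps

def row_key(entity_id: str, event_millis: int) -> str:
    """Build an HBase-style row key: salt | entity | reversed timestamp.

    - The salt prefix spreads sequential writes across SALT_BUCKETS regions.
    - Reversing the timestamp makes lexicographic scans return newest first.
    """
    salt = int(hashlib.md5(entity_id.encode()).hexdigest(), 16) % SALT_BUCKETS
    reversed_ts = MAX_MILLIS - event_millis
    return f"{salt:02d}|{entity_id}|{reversed_ts:013d}"

# Newer events sort before older ones for the same entity:
k_new = row_key("card-123", 1_700_000_001_000)
k_old = row_key("card-123", 1_700_000_000_000)
assert k_new < k_old
```

Because the salt is derived deterministically from the entity id, all events for one entity still share a prefix and can be read back with a single prefix scan.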

• BS/BA degree in Computer Science, Information Systems or related field
• Hands-on experience using ETL technologies and/or Big Data technologies
• Understanding of Data architecture, replication, and administration
• Experience working with real-time or near real-time ingestion
• Strong backend development experience with languages such as Java and Scala
• Hands-on experience with streaming technologies like Kafka, Spark, Flume
• Experience working with NoSQL technologies like HBase, Hive
• Experience working with search technologies like Solr or Elasticsearch
• Deep understanding of high-performance data concepts such as file stores, wide-column databases, key-value pairs, etc.
• Excellent oral and written communication
• Excellent problem-solving skills

• Agile/Scrum methodologies
• Experience using the Cloudera stack