Position Title: Data Architect
Position Number: 260903
Location: O’Fallon, MO
Desired Skill Set:
Big Data, Data Architecture, ETL, Hive, Python, Scala, Spark
“U.S. Citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor candidates at this time.”
Title: Data Architect
Location: O’Fallon, MO
Duration: 12 months
Have you ever wanted to be a part of something BIG? Do you have experience integrating new and emerging technologies into existing environments as technologies evolve and demands change? Are you eager to work with open source software and be exposed to the fast-paced world of Big Data technology? In this role, you can make an immediate impact for a leading global technology company, client.

The Consultant Engineer role is responsible for working with teams across the company to implement new data solutions while maintaining the stability of the platform. You will be responsible for assessing technologies and approaches for ingestion, transformation, and storage. In addition, you will work within the Fraud Data Engineering team to grow its knowledge and expertise. You will get the chance to work with extremely large data sets and be on the cutting edge of transforming the way client captures, processes, stores, and visualizes transactional data.

RESPONSIBILITIES
• Design scalable streaming solutions based on Spark, Kafka and/or Flume
• Work closely with ETL technologies such as SyncSort, Informatica, etc.
• Work closely with Business Intelligence technologies such as Qlik, Tableau, etc.
• Work closely with team members from across client to identify functional and system requirements
• Design, develop, and implement data models, with quality and integrity top of mind, to support our products
• Create documentation to support knowledge sharing, including flowcharts and diagrams
• Provide oversight and guidance to our Data Engineering development team
• Develop software utilizing open source technologies to interface with distributed and relational data solutions
• Work to establish a Hadoop architecture
KNOWLEDGE AND SKILL REQUIREMENTS
• BS/BA degree in Computer Science, Information Systems or related field
• Experience using ETL technologies and/or Big Data technologies
• Deep understanding of data architecture, replication, and administration
• Experience working with real-time or near real-time ingestion
• Strong backend experience using Python, Scala, HiveQL, Spark SQL, etc.
• Deep understanding of high-performance data concepts such as file stores, wide-column databases, key-value stores, etc.
• Excellent oral and written communication skills
• Excellent problem-solving skills
OTHER VALUABLE SKILLS
• Team Lead Experience
• Experience working with the Cloudera stack: Kafka, Spark, Flume, Hadoop, etc.
• Prior experience with Business Intelligence technologies
• Experience with Agile/Scrum methodologies
Rose International is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, arrest and conviction records, or any other characteristic protected by law. Positions located in San Francisco, California will be administered in accordance with the Fair Chance Ordinance.