Data Developer 2 - RT

Lafayette, CO  •  Industry: Information Services  •  Less than 5 years

Posted 328 days ago by Kevin Meyers

Data Developer 2 - RT

Southfield, Michigan 48075

Lafayette, Colorado 80026

Our first preference for location is Southfield, MI, but we'll also consider Lafayette, CO.

Candidates requiring visa sponsorship will not be considered for this position.

This is a mid-level position, but we're open to junior-level candidates with 1–3 years of experience. All of the technologies this group works with are proprietary.

  • Big data processing
  • Hadoop (our primary database is proprietary)
  • ETL experience (ours is proprietary)
  • Programming experience

About the Opportunity

The Developer II (Data Analysis) works with the Lead Developer to design and develop technical solutions on a proprietary, multi-node computing environment focused on 'big data' processing (billions of records).  

This requires full SDLC familiarity, including requirements review, analysis, design, programming, testing and implementation as assigned.

Responsibilities

  • Verify input data quality, including identification and communication of file variances and potential issues.
  • Read data file dumps and QC conversion output.
  • Set up, process, and QC data hygiene and merge/purge processes.
  • Set up and process statistical reports and provide these reports to account teams.
  • Maintain and adhere to project schedules.

Required Knowledge, Skills, and Abilities: (Submission Summary)

  1. Bachelor’s degree in Computer Science or a related field, or equivalent years of experience, required
  2. 1–3 years of development experience required
  3. Proven analysis, design, programming and implementation experience is required
  4. Experience manipulating and joining multiple large data files (millions of records, hundreds of GB) required.
  5. Knowledge of data formats (character fields, packed fields, binary, and hexadecimal fields) required.
  6. Experience with flat files, ASCII/EBCDIC, binary data, bitmapping, and ETL processing is fundamental.
  7. Knowledge and understanding of Hadoop, cloud computing, and massively parallel processing systems strongly preferred.
  8. Knowledge of UNIX/Linux/DO environments and directory structures is helpful
  9. Experience with data compression algorithms and techniques preferred.
  10. Development experience with C, C++, Perl, and shell scripting preferred
  11. Experience with database programs and basic SQL knowledge preferred
  12. Familiarity with Agile development concepts, JIRA, and Git
  13. Ability to solve problems without using traditional tools, software, or relational databases.
  14. Knowledge of data hygiene, address standardization, etc. is a plus.
  15. Strong written and oral communication skills
  16. Ability to read, modify, and create code
  17. Ability to coordinate and follow up on multiple tasks
  18. Attention to detail is necessary
  19. Ability to recommend and implement automation solutions for production processes.

  20. Big data processing experience
  21. Hadoop experience (our primary database is proprietary)
  22. ETL experience (ours is proprietary)
  23. Programming experience
  24. Present Salary?
  25. Salary Expectation?
  26. Must be a US Citizen or Green Card holder?
  27. Current Address?