Sr Big Data ETL Developer

Pragmatics

Washington, DC

Industry: Professional, Scientific & Technical Services


11 - 15 years



Job Description

Would you like to perform rewarding work while contributing to the success of an established, growing company? Pragmatics, Inc. seeks a Sr. Big Data ETL Developer to support a Federal Financial client in a fast-paced, data-driven environment.

As a Sr. Big Data ETL Developer, you will lead a team of highly skilled developers and engineers in the design and development of complex, risk-based system components that will drive business and economic value across the federal financial and securities-based market in the United States. Pragmatics, Inc. seeks to support the SEC as it endeavors to protect investors; maintain fair, orderly, and efficient markets; and facilitate capital formation. Our dynamic program team of IT leaders and small business partners is seeking bright, energetic, and talented individuals to join us as we bring our innovative and deep federal financial experience to the SEC's Division of Economic and Risk Analysis.

Duties

  • Acquire knowledge of our multiple types of operations data, with the desire to understand how the information is used by the SEC.
  • Work in conjunction with Business Analysts, DBAs, and Data Architects on the backend data warehouse reporting solution.
  • Manage and perform data cleansing, de-duplication, and harmonization of data received from, and potentially used by, multiple systems.
  • Actively engage in building robust ETL solutions that harness best practices.
  • Maintain and troubleshoot daily data loads and address any issues.

Required skills

  • Experience with modeling tools, including widely used Hadoop tools such as Apache Hive, Oozie, Pig, Impala, and BigSQL; knowledge of cloud-based Hadoop and data warehouse technologies such as Amazon EMR, Redshift, and DCOS; and knowledge of Hadoop metadata management tools such as HCatalog
  • Shall have experience with software version control; relational databases such as Oracle, SQL Server, and PostgreSQL; MPP database experience (e.g., Redshift and Netezza); and familiarity with API design principles and best practices
  • Shall have hands-on experience designing and delivering Hadoop-based big data platform solutions, and be a certified developer from a major Hadoop distributor

Desired skills

  • Excellent documentation and communication skills

Education and years of experience

  • Shall have a minimum of a Bachelor's degree in computer science or a related field, with at least 10 years of IT experience
  • Shall have a minimum of ten years of experience and expertise in big data environments, including Hadoop and Netezza, both on-premises and cloud-based


  • Must be a US Citizen