Salary: $120K to $130K
Position: Infrastructure Engineer Consultant
Required Skills: Hadoop, Hive, Spark, Pig, Storm, Kafka, NiFi, Atlas, Elasticsearch, Solr, Splunk, HBase applications, Ambari, Hortonworks, HDInsight, Cloudera Distribution (CDH), Kerberos, TLS encryption, SAML, LDAP
Duration: Full time
Required Citizenship / Work Permit / Visa Status: US citizen or green card holder
Location: Dallas, Texas
Local candidates are preferred; positions are open in Dallas and Chicago. Out-of-town candidates are responsible for their own relocation costs.
Job Purpose: This position is responsible for collaborating with the Solutions Engineering, Infrastructure Operations, and Infrastructure Service Management teams to design and build infrastructure solutions/blueprints for the area of responsibility; participating in the design and build of repeatable patterns (build kits) to improve deployment times for non-production and production environments; and transitioning knowledge to Infrastructure Operations.
Required Job Qualifications:
Bachelor's degree and 5 years of Information Technology or relevant experience, OR technical certification and/or college courses and
7 years of Information Technology experience, OR 9 years of Information Technology experience.
Expertise in implementing and troubleshooting Hive, Spark, Pig, Storm, Kafka, NiFi, Atlas, Elasticsearch, Solr, Splunk, and HBase applications.
Willingness to provide Hadoop on-call support as needed.
Working knowledge of Ruby or Python and common DevOps tools such as Git and GitHub.
Experience using a scripting language to automate infrastructure deployment tasks.
Intermediate-level experience with cloud technologies (Azure/AWS).
Ability to simplify and standardize complex concepts and processes.
Understanding of business priorities (e.g., vision), trends (e.g., industry knowledge), and markets (e.g., existing/planned).
Strong oral and written communication skills.
Ability to prioritize and make trade-off decisions.
Ability to drive cross-functional execution.
Adaptability and ability to introduce/manage change.
Teamwork and collaboration.
Organized and detail-oriented.
Preferred Job Qualifications:
Background in Hadoop application infrastructure engineering and development methodology.
Experience with Ambari, Hortonworks, HDInsight, and Cloudera Distribution (CDH).
Experience with Kerberos, TLS encryption, SAML, LDAP.
Knowledge of cloud (Azure/AWS) big data solutions such as EMR, HDInsight, Kinesis, and Azure Event Hubs.
Valid through: 11/26/2020