Hadoop Subject Matter Expert

State Farm Insurance Cos • Tempe, AZ

Industry: Accounting, Finance & Insurance

Experience: Less than 5 years

Duties and Responsibilities:

  • Provisions and configures compute resources across cloud and non-cloud landscapes
  • Develops and implements infrastructure solutions to ensure stable and available Open Systems computing environments
  • Integrates and manages Open Systems platforms with other infrastructure technologies and services in a heterogeneous environment
  • Performs planning, consulting, technical analysis, design, development, and deployment activities related to Open Systems
  • Maintains infrastructure integrity through research and deployment of upgrades and patches
  • Researches Open Systems software, tools, and emerging technologies
  • Develops and implements policies, strategies, guidelines, standards, and procedures related to Open Systems
  • Develops processes and tools for managing the Open Systems infrastructure
  • Provides proactive capacity planning along with proactive and reactive performance and availability services for the mainframe, open systems, storage, and network environments and the applications that reside on them
  • Ensures the technical infrastructure is sufficiently robust, scalable, and efficient to deliver the integrated services underlying the physical environment that supports the processes, resources, and operators required for developing, integrating, and sustaining enterprise applications and support services
  • Addresses the day-to-day management and maintenance of infrastructure services, systems, applications, and hardware usage
  • Supports architecture, preliminary design, implementation, and infrastructure operations
  • Develops infrastructure strategy and operational policies, standards, and processes tailored to agency or department missions
  • Develops asset management processes that support the provisioning, tracking, reporting, ownership, risk, controls, and financial status of Enterprise Technology assets
  • Enables service desk, help desk, and call center development, implementation, operations, and process improvement
  • Ensures service level management through the development of processes, people, technology, and service level and operating level agreements
  • Utilizes strong business and technical knowledge of platforms/operating systems to develop and support data, databases, or products
  • May be responsible for the implementation and ongoing administration of the Big Data ecosystem
  • Understands and uses software development practices to manage configuration and automate provisioning
  • Ensures infrastructure and operations security, such as network and application firewalls, authentication, identity management, and intrusion detection and prevention
  • Based on assignment, requires competencies in data center operations, infrastructure platforms, and service delivery. Technical specialties include local- and wide-area network design, servers, storage, backup, disaster recovery, continuity of operations, performance monitoring, virtualization, cloud computing, modeling, visualization, and other emerging technologies

Additional Details:

State Farm is adding to the teams responsible for the enterprise Hadoop platform, with a focus on making the platform easily consumable. We need analysts who are well versed in the Hadoop ecosystem and application development and who can effectively guide consumers in the design and development of new applications used in operations, analytics, data science, and machine learning.

At the heart of this team, we’re looking for people who are innovative and self-motivated and who enjoy working in an agile, collaborative environment.

If you’re looking for an organization striving to harness the latest technologies for competitive advantage, then State Farm is for you!

Required Technology Experience (3+ years of professional experience):

  • Must have professional hands-on experience in a Hadoop/Linux environment
  • Experience required with these core Hadoop projects: HDFS, Hive, Impala, MapReduce, and Oozie (see the illustrative HDFS sketch after this list)
  • Experience with additional Hadoop projects such as ZooKeeper, HBase, and Spark is a bonus
  • Background in an object-oriented language such as Java, Scala, or Python
  • General Linux knowledge, including the ability to execute Linux commands via PuTTY and comfort with command-line Unix/Linux usage
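
To make the hands-on HDFS expectation concrete, here is a minimal, illustrative Java sketch of the kind of cluster interaction described above: copying a local file into HDFS and listing the target directory with the Hadoop FileSystem API. The NameNode address and file paths are assumptions for the example, not details from this posting.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Minimal sketch: copy a local file into HDFS and list the directory.
public class HdfsQuickStart {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // On a configured edge node this is normally read from core-site.xml;
        // it is hard-coded here (hypothetical host) so the sketch is self-contained.
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

        try (FileSystem fs = FileSystem.get(conf)) {
            // Upload a local file into an assumed HDFS landing directory.
            fs.copyFromLocalFile(new Path("/tmp/events.csv"),
                                 new Path("/data/landing/events.csv"));

            // List the directory to confirm the upload.
            for (FileStatus status : fs.listStatus(new Path("/data/landing"))) {
                System.out.printf("%s\t%d bytes%n", status.getPath(), status.getLen());
            }
        }
    }
}
```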

Other Technology Experience or Willingness to Learn:

  • Data movement as it relates to migration as well as disaster recovery (active/passive)
  • Design principles and factors that impact system functionality and performance
  • Ability to configure Kafka/Flume (a minimal producer sketch follows this list); knowledge of Maven/Jenkins/GitLab/UrbanCode for deployment
  • Data design principles
  • Kerberos/encryption/LDAP
  • General understanding of networking
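
As an illustration of the Kafka configuration knowledge mentioned above, here is a minimal Java producer sketch using the standard Kafka clients API. The broker address, topic name, and payload are hypothetical placeholders, not details from this posting; on a Kerberized cluster the producer would also need SASL/SSL security properties.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Minimal sketch: publish one JSON event to a Kafka topic.
public class ProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1.example.com:9092"); // hypothetical broker
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // send() is asynchronous; flush() before exit so the record is delivered.
            producer.send(new ProducerRecord<>("claims-events", "policy-123",
                    "{\"event\":\"claim_opened\"}"));
            producer.flush();
        }
    }
}
```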

Soft Skills:

  • Strong communication and relationship-building skills to interact positively with business partners and internal team members
  • Demonstrated results in a fast-paced environment
  • Ability to quickly adapt to a changing environment
  • Ability to foster innovation, encourage diversity of thought, and incorporate new ideas
  • Knowledge of or experience with SAFe (Scaled Agile Framework)

req4144