AWS Solutions Engineer (Cloudera Hadoop expertise)

Confidential Company  •  Tampa, FL and Warren, NJ

5-7 years of experience

Salary depends on experience
Posted on 05/18/18

Functions  Consulting, Healthcare, Finance, Information Technology, Project Management, Program Management
Industries  Information Technology and Services, Healthcare, Capital Markets, Investment Banking, Alternative Investments, Financial Services, Management Consulting
 
Requirements

  • Bachelor’s degree or equivalent in computer science or a related field, plus 5+ years of experience
  • Expertise in AWS IAM setup, in order to delegate responsibilities to authorized personnel within an organization
  • Expertise with AWS VPCs, subnets, and route tables
  • Proficiency with AWS EC2 instances, Auto Scaling configuration, and containerization
  • Expertise in automated AWS EMR setups
  • Ability to develop AWS Lambda functions, preferably integrated with Kinesis streams and analytics
  • Ability to create AWS RDS instances for persisting application or process-orchestration metadata
  • Ability to install and administer the Cloudera distribution of Hadoop (HDFS, Hive, YARN, Spark)
  • Ability to use Cloudera Management Services and to upgrade and deploy patches on the cluster
  • Ability to perform basic Red Hat Linux administrative tasks, e.g., installing RPMs from YUM repositories
  • Ability to expand existing EBS volumes and attach new ones as filesystems on EC2 nodes
  • Ability to set up additional CloudWatch monitoring scripts for the cluster and its services (a sketch follows this list)
  • Financial services and healthcare industry experience preferred
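
A minimal sketch of the kind of custom CloudWatch monitoring script the requirement above describes, assuming boto3 is available; the namespace, metric name, and sample value are hypothetical, not from the posting:

```python
# Publish a custom cluster-health datapoint to CloudWatch.
# Namespace, metric name, and the sample value are illustrative only.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def report_hdfs_capacity(used_pct: float) -> None:
    """Push an HDFS capacity-used datapoint; a CloudWatch alarm can watch it."""
    cloudwatch.put_metric_data(
        Namespace="HadoopCluster/Health",  # hypothetical namespace
        MetricData=[{
            "MetricName": "HDFSCapacityUsedPercent",
            "Value": used_pct,
            "Unit": "Percent",
        }],
    )

if __name__ == "__main__":
    report_hdfs_capacity(72.5)  # example datapoint from a dfsadmin report
```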

Job Duties

  • Create AWS EMR clusters with scripted automation built on the AWS CLI or SDKs (Python or Java); see the EMR sketch after this list
  • Create an AWS VPC ecosystem and launch EC2 instances via CLI configuration (a VPC sketch follows the list)
  • Build containerized applications on a large cluster using AWS ECS
  • Develop serverless, trigger-based AWS Lambda functions using the Python or Java SDKs (a handler sketch follows the list)
  • Create AWS Kinesis Firehose delivery streams and integrate them with the Kinesis Data Analytics interface (a Firehose sketch follows the list)
  • Create an AWS Redshift database and integrate it with external applications
  • Create AWS WorkSpaces virtual desktops for teams within the organization
  • Align with the systems engineering team to propose and deploy the new hardware and software environments the AWS ecosystem requires, consistent with the organization’s network security requirements
  • Create IAM roles and groups with inline policies (custom policies where required) and attach them to users with different profiles (administrators, developers, invoicing and billing teams, etc.); see the IAM sketch after this list
  • Develop automation scripting in Jenkins (Terraform, Ansible, CloudFormation templates)
  • Set up Kerberos principals on the Cloudera Hadoop cluster and test HDFS, Hive, Pig, and MapReduce access for new users
  • Create automation for log file management leveraging AWS S3 (a lifecycle sketch follows the list)
  • Create a DevOps environment
  • Team diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability
  • Collaborate with application teams to install Hadoop and OS updates, patches, and version upgrades when required
  • Estimate Hadoop cluster usage and create YARN resource pools in Cloudera Manager
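
A minimal sketch of scripted EMR cluster creation with the Python SDK (boto3); the cluster name, release label, instance sizes, and roles are assumptions, not values from the posting:

```python
# Launch a transient EMR cluster via boto3.
# Name, release, instance types, and IAM roles are illustrative assumptions.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="example-analytics-cluster",   # hypothetical cluster name
    ReleaseLabel="emr-5.30.0",          # assumed EMR release
    Applications=[{"Name": "Hadoop"}, {"Name": "Spark"}, {"Name": "Hive"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE",   "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,  # transient: terminate when steps finish
    },
    JobFlowRole="EMR_EC2_DefaultRole",  # default EC2 instance profile
    ServiceRole="EMR_DefaultRole",
)
print("Cluster ID:", response["JobFlowId"])
```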
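A minimal sketch of VPC creation and an EC2 launch using boto3 rather than the raw CLI; the CIDR blocks and AMI ID are placeholders:

```python
# Carve out a VPC with one subnet, then launch an instance into it.
# CIDR blocks and the AMI ID are placeholders only.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
subnet_id = subnet["Subnet"]["SubnetId"]

instances = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    SubnetId=subnet_id,
)
print("Launched:", instances["Instances"][0]["InstanceId"])
```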
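A minimal sketch of a trigger-based Lambda handler fed by a Kinesis event source mapping; the record-processing logic is a stand-in for whatever the application actually needs:

```python
# Lambda handler invoked with a batch of Kinesis records.
# The processing body is a placeholder; real logic is application-specific.
import base64
import json

def handler(event, context):
    """Decode each Kinesis record in the batch and process it."""
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        message = json.loads(payload)
        print("partition key:", record["kinesis"]["partitionKey"], "->", message)
    return {"recordsProcessed": len(event["Records"])}
```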
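A minimal sketch of creating a Firehose delivery stream that lands in S3 and pushing one record to it; the stream name, role ARN, and bucket ARN are hypothetical:

```python
# Create an S3-backed Firehose delivery stream and send a sample record.
# Stream name, role ARN, and bucket ARN are placeholders.
import json
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.create_delivery_stream(
    DeliveryStreamName="example-events",  # hypothetical stream name
    S3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-s3-role",  # placeholder
        "BucketARN": "arn:aws:s3:::example-landing-bucket",            # placeholder
    },
)

# Note: the stream takes a minute or two to become ACTIVE before puts succeed.
firehose.put_record(
    DeliveryStreamName="example-events",
    Record={"Data": (json.dumps({"event": "login", "user": "jdoe"}) + "\n").encode("utf-8")},
)
```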
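A minimal sketch of an IAM group carrying an inline policy; the group, policy, and user names, and the read-only EMR policy itself, are illustrative assumptions:

```python
# Create an IAM group, attach an inline policy, and add a user to it.
# Group, policy, and user names are hypothetical.
import json
import boto3

iam = boto3.client("iam")

iam.create_group(GroupName="emr-developers")  # hypothetical group

# Inline policy: read-only EMR access for everyone in the group.
iam.put_group_policy(
    GroupName="emr-developers",
    PolicyName="emr-read-only",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["elasticmapreduce:Describe*", "elasticmapreduce:List*"],
            "Resource": "*",
        }],
    }),
)

iam.add_user_to_group(GroupName="emr-developers", UserName="jdoe")  # placeholder user
```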
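A minimal sketch of S3-based log file management via a lifecycle rule; the bucket name, prefix, and retention periods are assumptions:

```python
# Lifecycle rule: archive cluster logs to Glacier, then expire them.
# Bucket, prefix, and retention periods are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-cluster-logs",  # placeholder bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-then-expire-logs",
            "Filter": {"Prefix": "emr/logs/"},
            "Status": "Enabled",
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],  # cold storage after 30 days
            "Expiration": {"Days": 365},                               # delete after a year
        }],
    },
)
```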