Inovalon is a leading technology company that combines advanced cloud-based data analytics and data-driven intervention platforms to achieve meaningful insight and impact in clinical and quality outcomes, utilization, and financial performance across the healthcare landscape. Inovalon's unique achievement of value is delivered through the effective progression of Turning Data into Insight, and Insight into Action.
Inovalon is looking for a Senior Big Data Administrator in Bowie, MD. In this role, the Big Data Administrator is responsible for providing 24/7 operations support for Inovalon's mission-critical Hadoop, MongoDB, and Greenplum databases.
- Provide 24x7 operations support for large-scale Hadoop and MongoDB clusters across production, UAT, and development environments;
- Serve in an on-call rotation as an escalation contact for critical production issues;
- Build and administer Big Data clusters with HDFS, Kafka, ZooKeeper, Hive, YARN, Hue, Oozie, etc.;
- Support NoSQL databases such as HBase and MongoDB, as well as Hive;
- Manage replication links between clusters to maintain high availability;
- Configure and monitor MongoDB instances and replica sets;
- Analyze and debug slow-running development, performance, and production jobs;
- Ensure all databases are backed up to meet the business’s Recovery Point Objectives (RPO);
- Monitor the Ambari and Ops Manager dashboards, and troubleshoot and resolve Hadoop and MongoDB issues;
- Document database environments and standard operating procedures;
- Ensure that all big data components have the latest patches and correct versions of supporting tools;
- Work with vendors and user communities to research and test new technologies that improve the technical capabilities of existing Hadoop clusters;
- Perform capacity planning and monitor database growth;
- Assist development teams with big data related topics; and
- Build domain expertise and cross-train team members.
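To illustrate the replica set administration duties above, here is a minimal sketch of building a MongoDB replica set configuration document. The replica set name "rs0" and the `db0`-`db2.example.com` host names are hypothetical; in practice such a document is passed to `rs.initiate()` (the `replSetInitiate` command) on a running `mongod`, which this sketch does not do.

```python
# Sketch: assemble a MongoDB replica set config document.
# Hosts and the set name are hypothetical placeholders.

def build_replset_config(name, hosts, arbiter_index=None):
    """Build a replica set config from an ordered list of host:port strings."""
    members = []
    for i, host in enumerate(hosts):
        member = {"_id": i, "host": host}
        if i == arbiter_index:
            # Arbiters vote in elections but hold no data.
            member["arbiterOnly"] = True
        members.append(member)
    return {"_id": name, "members": members}

config = build_replset_config(
    "rs0",
    ["db0.example.com:27017", "db1.example.com:27017", "db2.example.com:27017"],
)
print(config["_id"], len(config["members"]))
```

A three-member set like this tolerates the loss of one member while still electing a primary, which is why production replica sets typically use an odd number of voting members.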
- 8-10 years of IT experience supporting any relational database, preferably MS SQL;
- 3-5 years' experience installing and configuring Hortonworks Hadoop clusters, including a combination of the following: backup and recovery of HDFS file systems (Java-based distributed filesystem); MySQL databases used by the cluster; configuration and maintenance of HDFS, YARN Resource Manager, MapReduce, Hive, HBase, Kafka, or Spark;
- Strong understanding of and experience with ODBC/JDBC connectivity, both with various clients such as Tableau and MicroStrategy and with server components;
- Monitor and tune cluster component performance.
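As an illustration of the ODBC/JDBC connectivity requirement above, here is a minimal sketch of assembling a HiveServer2 JDBC URL of the form `jdbc:hive2://<host>:<port>/<db>;<session-params>`, which is what BI clients like Tableau or MicroStrategy ultimately connect with. The host, database, and parameter values shown are hypothetical.

```python
# Sketch: build a HiveServer2 JDBC URL (jdbc:hive2://host:port/db;k=v;...).
# Host, database, and session parameters below are hypothetical examples.

def hive_jdbc_url(host, port=10000, database="default", **params):
    """Assemble a jdbc:hive2 connection URL with optional session params."""
    url = f"jdbc:hive2://{host}:{port}/{database}"
    if params:
        # Session variables are appended as semicolon-separated key=value pairs.
        url += ";" + ";".join(f"{k}={v}" for k, v in sorted(params.items()))
    return url

url = hive_jdbc_url("hive.example.com", database="analytics",
                    transportMode="http", httpPath="cliserver")
print(url)
```

Parameters are sorted here only so the output is deterministic; HiveServer2 does not care about their order.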