The Sr. Software Engineer supports the Johns Hopkins in Health Precision Medicine initiative. Responsibilities include building data ingest from source systems into the Precision Medicine data lake and transforming data to serve a variety of data consumers, including COVID-19 researchers. The Sr. Software Engineer reports to a Johns Hopkins IT Health Data Integration Services manager and is also expected to work closely with the Technology Innovation Center. This position plays an important role in helping Johns Hopkins leverage Electronic Medical Record (EMR) and other data assets for clinical research and for the development of clinical and patient-facing applications. The role requires an understanding of information and integration needs, source systems, quality standards, reporting and analytic requirements, and EMR data to support centralized and decentralized development of institutional and business intelligence content.
Knowledge, Skills, & Abilities (KSAs):
- Requires a thorough knowledge of data modeling, data management, and database development. Experience working in these areas in health system operations is preferred.
- Requires strong technical knowledge of data integration tools, data modeling tools, metadata tools, and database design. Understands the range of options and best practices for common ETL design techniques such as change data capture, key generation and optimization, and performance tuning.
- Requires strong proficiency in SQL programming, query writing, query performance tuning, and database technologies.
- Requires experience with all phases of the Software Development Life Cycle (SDLC).
- Requires analytical ability to solve complex technical problems and participate as part of a diversified staff in matrix-managed groups.
- Requires ability to articulate technical and organizational approaches to meet clinical and research data requirements.
- Requires ability to convey technical methods, approaches, and plans to an audience of varying degrees of technical understanding including peers and customers.
- Knowledge of ETL processes, cloud-based storage and data management technologies, such as Azure Databricks and Azure Data Factory is preferred.
- Experience with the following databases, tools, and languages is preferred: SQL Server, SQL Server Management Studio, SSIS, Java, C++, PHP, Python, IIS, SharePoint, and R.
- Technical certification in Epic Clarity and/or Caboodle is preferred.
- Candidate must be detail-oriented and able to manage multiple priorities effectively and resolve conflicting demands.
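The ETL design techniques named above include change data capture. As a minimal sketch of one common variant, watermark-based CDC, the loop below pulls only rows modified since the last successful load. The table and column names (`source_table`, `updated_at`) are hypothetical, and SQLite stands in for whatever source database is actually used.

```python
import sqlite3

def extract_changes(conn, last_watermark):
    """Pull only rows modified since the previous successful load."""
    rows = conn.execute(
        "SELECT id, value, updated_at FROM source_table "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    # The new watermark is the latest timestamp seen; keep the old one if no rows.
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark

# Demo against an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_table (id INTEGER, value TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO source_table VALUES (?, ?, ?)",
    [(1, "a", "2024-01-01"), (2, "b", "2024-01-02"), (3, "c", "2024-01-03")],
)
# Only rows newer than the 2024-01-01 watermark are extracted.
changes, watermark = extract_changes(conn, "2024-01-01")
```

Persisting the returned watermark between runs is what makes the extract incremental rather than a full reload.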
The primary responsibility of the Sr. Software Engineer is to perform data extraction, transformation, and loading (ETL) to support research and data visualization projects. This may involve extending the data lake with additional data to meet business needs. To achieve this, the Sr. Software Engineer will:
DESIGN, DEVELOPMENT, and DEPLOYMENT
- Implement data ingest pipelines and transformations into the Precision Medicine Analytics Platform data lake;
- Transform data to comply with standards such as the OMOP Common Data Model;
- Extract and transform data to comply with researcher requirements;
- Extract and transform data to comply with requirements from data visualization applications;
- Deliver data to research groups using the Precision Medicine Analytics Platform;
- Design and deliver guidelines that allow a broad base of internal and external developers to build content sourced in the data lake while adhering to established standards and strategy;
- Develop and validate the creation and maintenance of enterprise data definitions and metadata;
- Serve as a specialist/consultant on complex projects;
- Formulate and articulate plans for data or infrastructure architecture as needed;
- Participate in project planning to ensure effective use of technology and/or business process to meet customers’ needs;
- Track relevant cloud technologies to:
  - Determine their maturity and applicability to the enterprise;
  - Assess their relative impact on IT strategy and interpret that impact for the senior IT leadership team;
- Lead and manage strategic activities, including adoption of cloud services and continuous integration strategies;
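One of the transformations listed above is conforming data to the OMOP Common Data Model. The sketch below maps a source EMR patient record onto a row shaped like the OMOP `person` table; the output field names follow OMOP conventions, but the source record layout and the concept-ID lookup are illustrative assumptions, not the actual Johns Hopkins mapping.

```python
# Standard OMOP gender concept IDs (8507 = male, 8532 = female); any
# unmapped value falls back to 0, the OMOP "no matching concept" sentinel.
GENDER_CONCEPTS = {"M": 8507, "F": 8532}

def to_omop_person(source):
    """Map a hypothetical source patient record to an OMOP person row."""
    return {
        "person_id": source["patient_id"],
        "gender_concept_id": GENDER_CONCEPTS.get(source["sex"], 0),
        "year_of_birth": int(source["birth_date"][:4]),
    }

row = to_omop_person({"patient_id": 42, "sex": "F", "birth_date": "1980-05-17"})
```

In practice each such mapping rule would be validated against the OMOP CDM specification and the source system's data dictionary before deployment.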
IMPLEMENTATION AND MAINTENANCE
- Monitor changes and resolve highly complex problems as they arise by reviewing all processing and output of newly implemented solutions and by proactively verifying that each solution works successfully, in order to satisfy customer requirements and provide a smooth transition;
- Oversee changes by adhering to the change management policies and procedures for any given project to communicate to all parties the nature, significance, and risk factors of the solution;
- Conform to data policies, governance structures and control frameworks for the ongoing management of data to ensure convergence, synchronization, accuracy, completeness and reliability;
- Make recommendations for optimization and performance improvements to database design, as well as improvements to ETL processes and tuning of software and hardware for existing and projected needs;
- Work with source system owners to establish and measure data quality metrics, investigate and resolve source data issues, and continuously evaluate and refine transformation rules;
- Define roles and responsibilities for centralized and decentralized data management functions.
- Establish and monitor project and task schedules and ensure adherence to work deadlines;
- Contribute to staff evaluations.
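Establishing and measuring data quality metrics, as described above, can be as simple as tracking per-column completeness on each load. The sketch below is an illustrative assumption of how such a check might look; the field names (`mrn`, `dob`) and the 0.95 threshold are hypothetical, not an established standard.

```python
def completeness(records, columns):
    """Return the fraction of non-null values per column."""
    total = len(records)
    return {
        col: sum(1 for r in records if r.get(col) is not None) / total
        for col in columns
    }

batch = [
    {"mrn": "001", "dob": "1970-01-01"},
    {"mrn": "002", "dob": None},
    {"mrn": None, "dob": "1985-03-09"},
]
metrics = completeness(batch, ["mrn", "dob"])
# A load job might flag any column whose completeness falls below a
# threshold (0.95 here) and route it back to the source system owner.
failing = [c for c, v in metrics.items() if v < 0.95]
```

Trending these metrics over time gives source system owners an objective basis for the data quality conversations the role requires.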
Bachelor’s degree required. Additional experience may be substituted for education.
JHH: Related experience beyond the minimum training and experience qualifications may substitute for the formal education requirement at a rate of two years of experience for one year of education.
Six years of related work experience with computer systems and applications.
Technical certification in Epic Clarity and/or Caboodle is preferred.
Preferred Job Qualifications:
Knowledge of the assigned application as well as the platform on which it runs. Requires demonstrated experience in developing data models and data architecture. Experience with cloud-based data transformation technologies is preferred.