A minimum of three years’ experience is required in maintaining and modifying data standards, defining and enforcing data governance procedures, ensuring data integrity across multiple functions, publishing data quality reports, and meeting data accuracy goals.
A minimum of three years’ experience is required in creating, implementing, and maintaining a data integration tracking tool to coordinate, maintain, and manage all data integration projects and data quality efforts. Knowledge of Microsoft SharePoint and the National Data Exchange On-Boarding Activity Tracking System (NOBATS) tool is preferred.
Proficiency with Web Services: Simple Object Access Protocol (SOAP) and Representational State Transfer (REST).
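To illustrate the web-services proficiency named above, the following is a minimal Python sketch of constructing a SOAP 1.1 request envelope with the standard library; the operation name, parameter, and endpoint are hypothetical, not from any actual system referenced in this document.

```python
# Sketch: building a minimal SOAP 1.1 envelope with the standard library.
# The "GetIncident" operation and its parameter are hypothetical examples.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_envelope(operation: str, params: dict) -> str:
    """Return a SOAP envelope string wrapping the given operation."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, operation)
    for name, value in params.items():
        ET.SubElement(op, name).text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# A REST equivalent would typically be a plain HTTP GET against a
# resource URL, e.g. urllib.request.urlopen("https://example.gov/api/incidents/123").
envelope = build_soap_envelope("GetIncident", {"IncidentId": "123"})
print(envelope)
```

The same request expressed in REST style carries the operation and identifier in the URL and HTTP verb rather than in an XML body, which is the core practical difference between the two approaches.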
Possess proficiency with Information Exchange Package Documentation (IEPDs) and law enforcement data specifications: Incident and Arrest, Incarceration Booking, Probation and Parole.
A minimum of five years’ experience with programming languages is required. Java, C, C#, and .NET are desired.
A minimum of five years’ experience with databases is required. Experience with Oracle, Microsoft Access, MySQL, Sybase, Firebird, c-treeACE, SQL Developer, and other relational database software is desired.
A minimum of three years’ experience is required with transformation languages and schema definitions: XML, XSLT, XSD, DTD.
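The kind of XML-to-XML reshaping that XSLT performs can be sketched as follows. Applying a real XSLT stylesheet in Python requires a third-party processor such as lxml; this standard-library version expresses the same transformation imperatively, and the element names are hypothetical.

```python
# Sketch of an XML-to-XML transformation, the task XSLT is built for.
# Done here with the standard library; element names are illustrative.
import xml.etree.ElementTree as ET

SOURCE = """
<Incidents>
  <Incident id="I-1"><Offense>Burglary</Offense><Date>2023-01-05</Date></Incident>
  <Incident id="I-2"><Offense>Fraud</Offense><Date>2023-02-11</Date></Incident>
</Incidents>
"""

def transform(xml_text: str) -> str:
    """Reshape <Incidents> into a flat <Summary> list of offenses."""
    src = ET.fromstring(xml_text)
    out = ET.Element("Summary")
    for incident in src.findall("Incident"):
        item = ET.SubElement(out, "Item", ref=incident.get("id"))
        item.text = incident.findtext("Offense")
    return ET.tostring(out, encoding="unicode")

result = transform(SOURCE)
print(result)
```

An XSD or DTD would constrain the structure of documents like `SOURCE` before transformation; validation likewise requires a third-party library in Python.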
Shall lead data integrity and awareness efforts, implementing best practices and lessons learned for data reporting, data management, and data security.
Shall collaborate with multiple teams including development, operations, and security teams as well as end-user customers.
Shall apply various optimization techniques in order to produce automated data solutions for large-scale optimization of data management and analytical problems; techniques may include discrete-event simulation and dynamic programming.
Shall create and maintain predictive data and analytical models using machine learning, natural language processing, and statistical analysis methods such as classification, time-series analysis, regression, statistical inference, and validation tools; perform exploratory data analyses, generate and test working hypotheses, prepare and analyze historical data, and identify patterns.
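To make one of the listed statistical methods concrete, here is a minimal sketch of least-squares regression fit to illustrative historical counts and used to project the next period; the data values are made up, and a real effort would use a statistics library and proper validation.

```python
# Sketch: simple least-squares regression on illustrative historical data.
def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

months = [1, 2, 3, 4, 5]
incidents = [100, 110, 120, 130, 140]   # perfectly linear, for clarity
slope, intercept = fit_line(months, incidents)
projection = slope * 6 + intercept       # projected month-6 count
print(slope, intercept, projection)      # → 10.0 90.0 150.0
```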
Implement findings and make recommendations on the entire data lifecycle process.
Work directly with customers and stakeholders to present and articulate, in non-technical language, the techniques and major results of data analysis and management efforts.
Create innovative, repeatable business use cases for real-world optimization techniques, and quickly develop prototypes to test those use cases.
Work with Virtual Machines (VM) and Virtual Desktop Infrastructures (VDI); potentially work with cloud environments, including Amazon Web Services (AWS) and Microsoft Azure.
Propose data processing and automation projects to management, and create and maintain them, based on subject matter expertise in the available data and system architecture.
A minimum of three years’ experience in using data standards and validation tools is required. Experience with the National Information Exchange Model (NIEM), Logical Entity eXchange Specifications (LEXS), IEPDs, the Conformance Testing Assistant (ConTesA), and XCOTA is preferred.
A minimum of three years’ experience using Secure File Transfer Protocol (SFTP) for data submissions is preferred.
A minimum of five years’ experience with procedural SQL languages (PL/SQL, T-SQL) is desired.