Sr Solutions Architect

Paxata • Columbus, OH

Industry: Software

Experience: 5 - 7 years

Posted 384 days ago

About Paxata

Companies around the globe rely on Paxata to get smart about information. Paxata is the pioneer that intelligently empowers all business consumers to transform raw data into ready information, instantly and automatically, with an enterprise-grade, self-service data preparation application and machine learning platform. Our Adaptive Information Platform weaves data into an information fabric from any source and any cloud to create trusted insights. Business consumers use clicks, not code, to achieve results in minutes, not months. With Paxata, Be an Information Inspired Business.

What you would be doing

    • Participate in both the pre- and post-sales processes, assisting sales and services to analyze and architect how Paxata fits into a complex Hadoop® ecosystem
    • Engage with IT and Business executives of current customers for in-depth discussions about:
        1) Paxata's compatibility with any new or existing platforms
        2) How Paxata works with their enterprise Big Data strategies (including “classic” relational, warehouse, and NoSQL)
    • Work with prospective or current IT customers on Paxata installation planning and administration, including:
        1) Project management to drive customer success use cases and renewals
        2) Installation / upgrade procedures and proper hardware sizing / cluster configuration
        3) Performance tuning and troubleshooting
    • Build and deliver architecture presentations and / or whiteboard sessions to end users and prospects
    • Work closely with engineering and product management to document and build architecture planning and installation best practices, including technical collateral and guides
    • Serve as the interface with Cloudera (corporate partner) on overlapping installations

What you need

    • Ability to understand “Big Data” use cases, develop business-outcome-driven use cases, and translate them into compute and storage requirements in order to recommend the appropriate server and storage configuration for a Hadoop deployment
    • Hands-on experience with “Big Data” technologies in the Hadoop stack (e.g. Spark, MapReduce, Hive, Streaming), plus practical experience that allows you to distinguish between implementation reality and hype
    • Strong experience implementing application software in an enterprise Linux environment 
    • Good understanding of RDBMS, JDBC/ODBC, integration technologies, ETL, and system architecture
    • Minimum of a Bachelor’s degree in Computer Science, Software Engineering, Information Systems or related field; or equivalent experience
    • Minimum 5 years of client-facing experience in technical consulting or data analytics professional services
    • Strong oral and written communication skills; excellent presentation skills

Bonus (“Nice-to-Haves”)

    • Background in vertical industries such as financial services, government, CPG, and/or high-tech manufacturing 
    • Experience with data quality (DQ) and master data management (MDM) tools
    • Familiarity with analytics applications, such as Tableau, QlikView, Spotfire, GoodData, and/or MicroStrategy
    • Experience working with Apache Spark