Data Solutions Engineer (Hadoop)

Franklin, TN 37067

Posted: 10/15/2018
Category: Big Data / Database / BI
Job Number: 7761
Overview:

This role is part of the Data Science & Engineering Competency Center, which enables big data & analytics solutions for the enterprise, working with our most progressive business areas to collect, analyze, and visualize large sets of data using multiple platforms, with an emphasis on innovation. This technical role requires advanced expertise in a broad range of software development tools & technologies and the ability to design and code strategic, high-performance solutions that accelerate business value through data.

Responsibilities:
  • Architect, develop & support Big Data and advanced analytical solutions with an emphasis on strategic implementations and the overall Big Data framework
  • Promote enterprise standard technologies, contribute to the Big Data working group, evaluate & recommend emerging technologies, and collaborate with key stakeholders on innovation opportunities
  • Translate complex functional & technical requirements into detailed architecture, design, and high-performing software solutions
  • Work on multiple complex projects as a technical lead/SME and oversee other Data Engineers' designs
  • Code, test, and implement data & analytics solutions in alignment with the project schedule(s)
  • Create data flow diagrams and other living documents to support the data solutions, while also advising & coaching other developers to ensure consistency
  • Expand the enterprise data product catalog in the Big Data environment and extend the data platform's capabilities to solve new data problems and challenges
  • Perform logical & physical database design for Big Data solutions, construct appropriate data flows, and follow enterprise standards
  • Ensure effective automated processes, high data availability, and operationalization of all products
  • Oversee the enterprise data catalog components & metadata
  • Recommend and advise on all Big Data components, roadmap, and emerging opportunities
  • Contribute to standards development & data governance
Job Knowledge and Skills:
  • Advanced knowledge of data management concepts such as data warehousing, ETL, and data integration
  • Experience with agile/scrum or other rapid application development methods
  • Demonstrable experience with object-oriented design, coding and testing patterns as well as experience in engineering (commercial or open source) software platforms and large-scale data infrastructures
  • Advanced understanding of network configuration, devices, protocols, speeds and optimizations
  • Strong understanding of the Java ecosystem and enterprise offerings
  • Ability to understand big data use-cases, and recommend standard design patterns commonly used in Hadoop-based deployments
Education/Experience Desired:
  • Bachelor's degree in computer science, computer engineering, or another technical discipline, or equivalent work experience
  • 7-10 years of professional IS/IT experience overall
  • Over 5 years of large-scale software development and integration via data engineering, data science, or software engineering with a concentration on Big Data
  • 3-5 years of demonstrated experience leading teams of data/software engineers and/or data scientists
  • Capability to architect highly scalable & complex systems using a variety of open source tools
  • Direct, hands-on design, development, deployment & support of software solutions with a recent emphasis on Hadoop solutions
  • Hortonworks HDP Certified Developer and/or HDP Certified Spark Developer credential
  • Experience designing queries against data in HDFS (Hive & HBase)
  • Seasoned developer using multiple programming languages (e.g., Java) and scripting tools such as bash shell scripts, Python/PySpark, and/or Perl
  • Experience with R
  • Significant previous work writing to network-based APIs, preferably REST/JSON or XML/SOAP
  • Solid background in database design, modeling, and data integration on a variety of relational databases (DB2, Oracle, SQLServer, Postgres, etc.) and NoSQL databases
  • Advanced knowledge & experience with all aspects of Hadoop ecosystem (Pig, Hive, Oozie, Kafka, Hue, Spark, Zeppelin, Atlas, Solr, LLAP, etc.)
  • Experience with messaging technologies (MQ, ActiveMQ, etc.)
  • NiFi experience preferred

Clay Haywood
