
Big Data Engineer - 1017871

Pittsburgh, PA

Employment Type: Direct Hire | Industry: IT | Job Number: 1017871 | Pay Rate: Negotiable

Job Description


Lighthouse Technology Services is partnering with our client to fill their Big Data Engineer role! This is a direct hire role with remote flexibility; occasional travel to Buffalo, New York is required. You will be responsible for building the data lake and the related processes for collecting, storing, managing, and analyzing data! If you're interested in having a ton of ownership, kicking this project off from scratch, and deciding which tools and technologies to use, this is the role for you. The position reports to the Director of Software Engineering, who has been with the company since its inception and is a wealth of knowledge!

Position Overview

In this engineer/architect role, you will bring a strong background in heterogeneous data management environments and a passion for building software that collects, aggregates, and analyzes data at scale. You should have extensive hands-on experience with data warehousing best practices and techniques in medium to large enterprise environments. You will work with our software engineering, product management, and development engineering teams to shape our data strategy.

What You'll Be Doing:
  • Anticipate and plan for the growth of the company's data platform infrastructure
  • Scope, design and implement platform solutions that make the appropriate tradeoffs between resiliency, durability, and performance
  • Help debug and solve critical infrastructure issues across our services and multiple levels of the stack
  • Participate in all phases of the Software Development Life Cycle (SDLC)
  • Adhere to high-quality development principles while delivering projects with technical excellence
  • Identify bottlenecks and bugs, and devise solutions to mitigate and address these issues
  • Participate in peer reviews of solution designs and related code
  • Package and support deployment of releases
  • Maintain high standards of software quality within the team by establishing good practices and habits
  • Develop documentation throughout the SDLC
  • Translate requirements into high-quality, testable, scalable software
  • Design, develop, and unit test applications in accordance with established standards
  • Assess opportunities for application and process improvement
  • Develop high-quality, state-of-the-art algorithms

What You'll Need to Have:
  • In-depth understanding of data management (e.g., permissions, recovery, security, and monitoring)
  • Strong understanding of and practical experience with data platform fundamentals, including clustering, distributed systems, fault tolerance, networking, etc.
  • Strong understanding and practical experience with systems such as Hadoop, Spark, Presto, Iceberg, and Airflow
  • Experience planning and driving large projects involving multiple stakeholders across an organization
  • Involvement in high-scalability projects spanning cloud-based infrastructure design and implementation
  • Experience with Microsoft development tools and services (Visual Studio and Azure DevOps are highly desired)
  • Experience with document-oriented and other non-relational databases
  • Strong relational database experience and SQL/stored procedure skills
  • Demonstrated ability to work cross-functionally to meet program requirements
  • Working knowledge of the agile software development life cycle
  • B.S. or M.S. in Computer Science or equivalent
  • 3-5+ years of experience working with data warehouses or databases
  • 3+ years of strong hands-on experience with cloud tools: Azure Databricks, Delta Lake, Azure Data Factory, SQL Server, Azure Synapse, Azure DevOps, Azure Data Lake Storage (ADLS), PySpark, Databricks Notebooks
  • 3+ years of experience with big data technologies such as Presto/Trino, Spark, Hadoop, Airflow, Kafka, Apache Flink, Druid, and Apache Pinot
  • Familiarity with ML.NET, TensorFlow, or similar machine learning and deep learning packages
  • Experience with IoT communication protocols, web security, and software product lifecycle