Staff Data Engineer

Buffalo, NY, USA ● Virtual Req #3596
January 14, 2025

Who we are looking for:

 

The data engineering team's mission is to enhance vehicle decoding accuracy and to provide high availability and high resiliency as a core service to our ACV applications. The team is also responsible for database-to-database ETLs using a variety of ingestion techniques. We handle a range of critical tasks aimed at ensuring the smooth, efficient functioning and high availability of ACV's data platforms. We serve as a crucial bridge between the Infrastructure Operations, Data Infrastructure, Analytics, and Development teams, providing valuable feedback and insights to continuously improve platform reliability, functionality, and overall performance.

 

We are seeking a talented data professional to join our Data Engineering team as a Staff Data Engineer. This role requires strong experience in software development, multi-cloud technologies, and in-memory data stores, along with a strong desire to learn complex systems and new technologies. It also requires a sound foundation in database and infrastructure architecture, deep technical knowledge, excellent communication skills, and an action-based philosophy for solving hard software engineering problems.

 

What you will do:

 

As part of the Data Engineering team, you will be responsible for Python development for APIs and ETLs, application architecture, SQL query optimization, collaboration with other teams on database and development support, and the design and development of scalable data services.

 

As a Staff Data Engineer at ACV Auctions, you will design, develop, and modify code. You will work alongside other data engineers and data scientists on solutions to ACV's most complex software problems. You are expected to operate effectively in a high-performing team, to balance high-quality delivery with customer focus, and to have a record of delivering and guiding team members in a fast-paced environment. You will be a leader and mentor for the more junior engineers on the team.

 

  • Actively and consistently support all efforts to simplify and enhance the customer experience.

  • Design, develop, maintain, and support code for our web-based applications and ETLs using Python and FastAPI.

  • Develop complex data models using common patterns such as EAV, normal forms, append-only, event sourcing, or graphs.

  • Support multi-cloud application development.

  • Design and build complex systems that can scale rapidly with little maintenance.

  • Design and implement effective service/product interfaces.

  • Contribute, influence, and set standards for all technical aspects of a product or service including but not limited to, testing, debugging, performance, and languages.

  • Support development stages for application development and data science teams, with an emphasis on Postgres database development.

  • Influence the team's designs and the direction of the applications we own.

  • Actively seek out new or additional technologies to improve the data layer of our applications.

  • Influence company-wide engineering standards for tooling, languages, and build systems.

  • Leverage monitoring tools to ensure high performance and availability; work with operations and engineering to improve as required.

  • Ensure that data development meets company standards for readability, reliability, and performance. 

  • Collaborate with internal teams on transactional and analytical schema design.

  • Conduct code reviews, develop high-quality documentation, and build robust test suites.

  • Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively. This may include participating in the emergency after-hours on-call rotation.

  • Mentor junior data engineers.

  • Lead technical discussions and innovation, including engineering tech talks.

  • Lead engineering innovation, including the discovery of new technologies, implementation strategies, and architectural improvements.

  • Perform additional duties as assigned.

 

What you will need:

 

  • Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent work experience).

  • Ability to read, write, speak, and understand English.  

  • 8+ years of experience programming, building, and supporting SaaS web applications.

  • 5+ years of experience programming in Python with FastAPI.

  • 5+ years of experience with ETL workflow implementation (Airflow, Python)

  • 5+ years of experience with continuous integration and build tools.

  • 5+ years of experience with cloud platforms, preferably AWS or GCP.

  • Deep knowledge of day-to-day tools and how they work, including deployments, Kubernetes (k8s), monitoring systems, and testing tools.

  • Highly proficient with version control systems, including trunk-based development, planning multiple releases, cherry-picking, and rebasing.

  • Proficient in relational databases (RDBs) and SQL, and able to contribute to table definitions.

  • Self-sufficient debugger who can identify and solve complex problems in code.

  • Deep understanding of major data structures (arrays, dictionaries, strings).

  • Expert experience with Domain Driven Design.

  • Experience with containers and Kubernetes.

  • Expert experience with database monitoring and diagnostic tools, preferably Datadog.

  • Strong proficiency in SQL query writing and optimization.

  • Advanced experience with database security principles and best practices.

  • Experience with in-memory data processing.

  • Advanced knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks.

  • Hands-on skills and the ability to drill deep into the complex system design and implementation.

  • Hands-on experience with Kafka or other event streaming technologies.

  • Experience with Airflow, Visual Studio, PyCharm, Redis, and Fivetran.

  • Strong communication and collaboration skills, with the ability to work effectively in a fast-paced, global team environment.

  • Experience working with:

    • SQL data-layer development experience; OLTP schema design

    • Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP.

    • GitHub, Jenkins, Python, Docker, Kubernetes.

 

Compensation: $155,000.00 - $211,000.00 annually. Please note that final compensation will be determined based upon the applicant's relevant experience, skill set, location, business needs, market demands, and other factors as permitted by law. #LI-AM1

 

No immigration or work visa sponsorship will be provided for this position.

Other details

  • Job Family Product & Technology
  • Job Function Engineering
  • Pay Type Salary
Location
  • Buffalo, NY, USA
  • Virtual