Python Snowflake Data Engineer - ONSITE

Apply now »

Date: Mar 27, 2026

Location: Buffalo, NY, US

Company: NTT DATA Services

Req ID: 365000 

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Python Snowflake Data Engineer - ONSITE to join our team in Buffalo, New York (US-NY), United States (US).

The Technical Data Engineer will design, develop, and maintain enterprise-grade data solutions that support regulatory and risk reporting requirements. This includes analyzing centralized enterprise data in Snowflake, building Python-based microservices and Flask APIs for data movement, implementing business and computational rules, and constructing data models in PostgreSQL to support Enterprise Risk reporting and FED submissions using Power BI (PBI).

Key Responsibilities

Data Analysis & Source System Expertise

  • Analyze and profile enterprise data stored in Snowflake, understanding structures, lineage, and relationships.
  • Perform data validations, quality checks, and metadata reviews.
  • Translate business and regulatory reporting requirements into technical data specifications.

Data Pipeline Development (Snowflake → ARK → PostgreSQL) & Data Processing

  • Apply strong SQL and data analysis skills.
  • Design PostgreSQL schemas and stored procedures, and optimize queries.
  • Engineer ETL/ELT pipelines, data transformations, and rule-based data computations.
  • Build Python microservices hands-on (Flask preferred).
  • Build and maintain Python-based microservices to orchestrate and automate ELT/ETL workflows.
  • Develop Flask-based REST APIs to extract, transform, and deliver data between Snowflake, ARK databases, and downstream systems.
  • Implement complex business rules, transformation logic, data calculation formulas, and FED-reporting computations.
  • Optimize pipelines for scalability, resilience, and performance.

API & Microservices Development

  • Design, implement, and deploy Python microservices to support data ingestion and enrichment.
  • Build secure Flask APIs to expose and consume data services for ARK and reporting systems.
  • Implement authentication, authorization, error handling, and logging within APIs and services.
  • Integrate CI/CD pipelines for automated build, test, and deployment.

 

Data Modeling & Reporting Layer Engineering (Power BI)

  • Design and build PostgreSQL data models optimized for analytics and regulatory reporting.
  • Create schemas, tables, stored procedures, indexes, and reporting-optimized structures.
  • Support dashboards and reporting modules used for Enterprise Risk reporting to the FED.
  • Develop computation logic for enterprise risk metrics, aggregation layers, time-series calculations, and regulatory formulas.
  • Understand the business requirements and translate them into a reporting data model suitable for Power BI.
  • Write optimized PostgreSQL queries, views, or stored logic to build a complex dataset for efficient report consumption.
  • Build a clean semantic model in Power BI with proper relationships, hierarchies, and DAX measures.
  • Ensure data quality, validate business rules, and manage complex joins in the dataset.
  • Optimize report performance through query tuning, model simplification, and efficient PBI design patterns.
  • Coordinate data refresh scheduling, troubleshoot errors, and ensure reports run reliably end to end.

Leveraging AI Tools & Platforms

  • Leverage AI tools (Claude, Microsoft Foundry, Copilot, etc.) to accelerate solution design, documentation, and code generation.
  • Use AI-assisted data analysis to explore datasets, identify patterns, and derive insights that support reporting needs.
  • Develop and refine prompts to ensure accurate outputs, and validate AI-generated content for quality, reliability, and compliance.
  • Integrate AI-assisted workflows into development processes (e.g., code reviews, testing, optimization, debugging).

 

Technical Leadership & Other Skills

  • Collaborate closely with architects, business analysts, QA teams, and risk domain SMEs.
  • Collaborate with Risk and Compliance teams to validate calculations and adhere to regulatory frameworks.
  • Lead technical discussions around pipeline design, system integration, API frameworks, and data modeling.
  • Support code reviews, peer collaboration, and best‑practice adoption across teams.
  • Experience with data governance, metadata management, and enterprise data quality frameworks.
  • Effective communication, documentation, and analytical skills.

 

Basic Qualifications:

  • 5+ years experience in technical leadership roles.
  • 5+ years experience analyzing and profiling Snowflake data, ensuring data quality, lineage understanding, and accurate translation of business/regulatory requirements into technical specifications.
  • 5+ years experience designing and engineering end‑to‑end ETL/ELT pipelines across Snowflake, ARK, and PostgreSQL, implementing complex transformation logic, business rules, and FED‑reporting computations.
  • 7+ years experience building and maintaining Python microservices and Flask‑based REST APIs for data orchestration, ingestion, and integration with downstream systems.
  • Experience with implementing authentication, authorization, error handling, and logging within APIs and services.
  • 7+ years experience developing optimized PostgreSQL schemas, stored procedures, and performance‑tuned queries to support analytics and regulatory reporting.
  • 5+ years experience engineering Power BI reporting layers, semantic models, DAX measures, and high‑performance datasets for enterprise risk dashboards and regulatory reporting.

 

Desired Skills:

  • Ability to ensure data accuracy, perform validation and reconciliation, and resolve complex reporting and refresh pipeline issues.
  • Experience leveraging AI tools (Copilot, Claude, Foundry) to accelerate development, automate code generation, enhance documentation, and support AI‑assisted analysis and testing.
  • Experience integrating CI/CD pipelines for automated build, test, and deployment.

 

Where required by law, NTT DATA provides a reasonable range of compensation for specific roles. The starting pay range for this role is $92,408 - $213,906. This range reflects the minimum and maximum target compensation for the position across all US locations. Actual compensation will depend on a number of factors, including the candidate’s actual work location, relevant experience, technical skills, and other qualifications.

 

#LI-NorthAmerica

INDFS

 

About NTT DATA

NTT DATA is a $30 billion business and technology services leader, serving 75% of the Fortune Global 100. We are committed to accelerating client success and positively impacting society through responsible innovation. We are one of the world's leading AI and digital infrastructure providers, with unmatched capabilities in enterprise-scale AI, cloud, security, connectivity, data centers and application services. Our consulting and industry solutions help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have experts in more than 50 countries. We also offer clients access to a robust ecosystem of innovation centers as well as established and start-up partners. NTT DATA is a part of NTT Group, which invests over $3 billion each year in R&D.

Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client’s needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form, https://us.nttdata.com/en/contact-us.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.


Nearest Major Market: Buffalo

Job Segment: Testing, Cloud, Database, Quality Assurance, SQL, Technology
