
Lead Technical Architect - Big Data


Date: Oct 12, 2021

Location: Phoenix, AZ, US

Company: NTT DATA Services

Job Responsibilities Include:

 

  • Lead a team of Data Engineers and Data Analysts; responsible for delivering or leading the design, development, support, maintenance, and deployment of functional applications as well as information support processes.
  • Complete tasks and associated documentation within committed timeframes, communicate effectively across teams and all levels of management, and lead others on the team with task assignments and deliverables.
  • Develop, maintain, and review knowledge management artifacts, and lead the transition to operations functions following implementation delivery.
  • Understand multiple data sources such as XML, relational, CSV, text, and Excel files to feed into the Data Lake.
  • Work with technical and business stakeholders to understand the data sources, ingestion, movement, structures, and processes in existing data warehouses to be able to generate requirements and designs.
  • Extract data from the Data Lake and feed it into the Data Warehouse, and vice versa.
  • Extract data incrementally or as a full load, per requirements (see the sketch after this list).
  • Follow configuration and change management processes.
  • Ability to solve problems with data.
  • Experience working with a public cloud environment (AWS preferred).
  • High attention to data accuracy.
  • Ability to work in an agile team.
  • Critical thinking to ask questions, determine the best course of action, and offer solutions.
  • Effective analytical and decision-making skills.
  • Strong interpersonal skills to build relationships and communicate effectively with managers, co-workers, and business users.
  • Demonstrated ability to work effectively in a fast-paced, complex, and dynamic business environment.
  • Enjoy being challenged and solving complex problems on a daily basis.
  • Coordinate with team members across different geographies.
  • Support new proposals and POCs
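
For illustration only: a minimal Spark (Java) sketch of the kind of incremental and full-load extraction from the Data Lake described above. The bucket paths, watermark column, and cutoff value are hypothetical placeholders, not details of the actual environment.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import static org.apache.spark.sql.functions.col;

    public class LakeToWarehouseExtract {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("lake-to-warehouse-extract")
                    .getOrCreate();

            // Hypothetical Data Lake location and watermark column.
            Dataset<Row> source = spark.read().parquet("s3a://example-data-lake/claims/");

            // Full load when requested; otherwise extract only rows changed since the last run.
            boolean fullLoad = args.length > 0 && "full".equalsIgnoreCase(args[0]);
            Dataset<Row> extracted = fullLoad
                    ? source
                    : source.filter(col("last_modified_ts").gt("2021-10-01 00:00:00"));

            // Hypothetical warehouse staging location; overwrite on full load, append on incremental.
            extracted.write()
                    .mode(fullLoad ? "overwrite" : "append")
                    .parquet("s3a://example-warehouse-staging/claims/");

            spark.stop();
        }
    }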

 

 

 

Basic Qualifications: 

 

  • 8+ years of experience in the IT industry.
  • 5+ years of experience with Hadoop technologies such as HBase, MapReduce, HDFS, Impala, Hive, Pig, Sqoop, and SQL.
  • 3+ years of experience with Spark using Java.
  • 3+ years of experience with Java 8+.
  • Bachelor's or Master's degree in Computer Applications, or an equivalent engineering/IT degree, from an accredited university.

 

Preferred Skills:

 

  • Experience deploying applications in an AWS environment; ability to architect, design, deploy, and manage cloud-based Hadoop clusters. Working knowledge of AWS Glue, Redshift, and other AWS services.
  • Experience in the healthcare industry.
     


 

