
Data Lake/Big Data Architect - Charlotte, NC

Bank of America | Charlotte NC 28255 USA | Full Time | Posted: 11/12/2019

Job Description

Participates in the design, development and implementation of architectural deliverables, including components of the assessment and optimization of system design and review of user requirements. Contributes to determining the technical and operational feasibility of solutions. Develops prototypes of the system design and works with database, operations, technical support and other IT areas as appropriate throughout the development and implementation processes. May lead multiple projects with competing deadlines. Serves as a fully seasoned/proficient technical resource; provides technical knowledge and capabilities as a team member and individual contributor. Will not have direct reports but will influence and direct activities of a team related to special initiatives or operations, as well as mentor junior band 5 Architect 1s. Provides input on staffing, budget and personnel. Typically has 7 or more years of architecture experience.

Enterprise Risk Finance Technology (ERFT):

  • Believes diversity makes us stronger so we can reflect, connect and meet the diverse needs of our clients and employees around the world.
  • Is committed to building a workplace where every employee is welcomed and given the support and resources to perform their jobs successfully.
  • Wants to be a great place for people to work and strives to create an environment where all employees have the opportunity to achieve their goals.
  • Provides continuous training and development opportunities to help employees achieve their career goals, whatever their background or experience.
  • Is committed to advancing our tools, technology, and ways of working to better serve our clients and their evolving business needs.
  • Believes in responsible growth and is dedicated to supporting our communities by connecting them to the lending, investing and giving resources they need to remain vibrant and vital.


The Data Lake/Big Data Architect is responsible for implementation of our strategic Big Data platform, Data Lake and analytic applications. The candidate is responsible for planning and architecting data lake, analytics, and machine learning applications on a big data platform, and should have experience with Data Lake, Data Ingestion, Data Wrangling, Data Processing and Data Mining frameworks and technologies. This position requires strong attention to detail, deep technical expertise, and superb communication and presentation skills.

The Architect is responsible for:

  • Architecting a next-generation Hadoop Data Lake from scratch, along with analytics applications built on a group of core Hadoop technologies.
  • Evaluating new technologies and open source or third-party products, and researching to identify opportunities that impact business strategy and meet business requirements.
  • Proactively bringing new ideas, researching emerging technologies and discussing with business and management how to transform current-state technologies to the future state.
  • Developing a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels.
  • Performing architecture design, data modeling, and implementation of the Big Data platform.
  • Enabling big data and batch/real-time analytical solutions that leverage emerging technologies.
  • Facilitating the ingestion of data from a variety of sources, ensuring that it adheres to Enterprise data management standards.
  • Developing standards, guidelines, and best-practices documentation for development teams.
  • Conducting performance tuning of Hadoop applications.

Required Skills:

  • 8+ years of overall IT experience including the following:
  • Hands-on experience with "big data" platforms and tools, including data ingestion (batch & real time), transformation and delivery in the Hadoop ecosystem (such as Hadoop, Pig, Hive, Flume, Oozie, Avro, YARN, Kafka, Storm)
  • Proficiency in Spark, Scala, Hive, Impala, Mahout
  • Experience in architecture and implementation of large and highly complex projects using Cloudera or Hortonworks.
  • Deep understanding of cloud computing infrastructure and platforms, including load balancing, networks, scaling, and in-memory computing
  • Understanding of Graph Databases
  • Hands-on experience resolving critical barriers during project delivery
  • History of working successfully with multi-location, cross-functional engineering teams
  • Capability to architect highly scalable distributed systems using a variety of open source tools
  • Strong knowledge of various DBMS systems, including NoSQL architectures and design principles
  • Proven experience in migrating data and applications from relational repositories to big data solutions

Desired Skills:

  • M.S. Computer Science or Data Science
  • Previous experience in Financial Crimes or Anti-Money Laundering


1st shift (United States of America)

Hours Per Week:


Job Details

Location Charlotte, NC, 28255, United States
Categories Information Technology

Contact Information

Contact Name -
How to apply Employer provided a link where your application will be accepted. Click on the link below and follow instructions.
Apply Click Here (apply to job)
Job Code 19057249
