Feature Lead Technology

Company: Bank of America
Location: Plano
Posted on: January 15, 2022

Job Description:

Position Summary

Responsibilities
--- Lead projects end-to-end. Meet with business users to determine requirements, analyze the data lake for relevant datasets, collaborate with other developers to design a technical solution, and see the project through to completion.
--- Design and build analytical workflows that take data from the lake and transform, filter, and aggregate it to compute a meaningful result or report for use by a consumer application (see the sketch after this list)
--- Develop a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels
--- Continuously evaluate new technologies, and innovate and deliver solutions for business-critical applications
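
For illustration only, a minimal PySpark sketch of the kind of batch workflow described above: read a dataset from the lake, filter and aggregate it, and write a report for a consumer application. The paths, column names, and dataset are hypothetical and not part of this posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("lake-report").getOrCreate()

    # Read a hypothetical transactions dataset from the data lake.
    txns = spark.read.parquet("hdfs:///data/lake/transactions")

    # Filter to the reporting window, then aggregate per channel to
    # compute a result a downstream consumer application can use.
    report = (
        txns.filter(F.col("txn_date") >= "2022-01-01")
            .groupBy("channel")
            .agg(F.count("*").alias("txn_count"),
                 F.sum("amount").alias("total_amount"))
    )

    # Persist the report where the consumer application can read it.
    report.write.mode("overwrite").parquet("hdfs:///data/reports/channel_summary")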

Required Skills
--- Bachelor's / Master's degree in Computer Science or equivalent experience
--- Minimum 8 years of Software Development experience
--- Minimum 5 years of experience with the Hadoop/Cloudera ecosystem, such as Spark, MapReduce, YARN, Hive, Sqoop, Impala, Kafka, and Oozie
--- Experience with Apache Spark
--- Experience with Unix / Linux and shell scripting
--- Experience with two or more programming languages (SQL, Java, Python, Scala, R)
--- Experience leading software projects, from the design through release phases
--- Experience using the data lake to design and produce analytical output through batch and real-time processing
--- Strong understanding of capacity planning, software development lifecycle, and enterprise production deployments
--- Hands-on experience benchmarking systems, analyzing bottlenecks, and designing performant code

Preferred Skills
--- SDLC Methodology - Agile / Scrum / Iterative Development
--- Job Scheduling Tools (Autosys)
--- Version Control System (Git, Bitbucket)
--- Continuous Integration / Continuous Delivery (CI/CD) pipelines (Jenkins)
--- Real-Time Streaming (Kafka), as sketched after this list
--- Visual Analytics Tools (Tableau)
--- NoSQL Technologies (HBase)
--- Data Integration and Data Security on the Hadoop ecosystem (Kerberos)
--- Awareness of or experience with a data lake on the Cloudera ecosystem
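
As a sketch of the real-time streaming side, a minimal Spark Structured Streaming job that subscribes to a Kafka topic. It assumes the spark-sql-kafka connector is available; the broker address and topic name are hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

    # Subscribe to a hypothetical Kafka topic of transaction events.
    events = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")
             .option("subscribe", "transactions")
             .load()
    )

    # Kafka delivers key and value as binary; cast the value to a
    # string before any downstream parsing.
    decoded = events.select(F.col("value").cast("string").alias("payload"))

    # Stream decoded records to the console for demonstration.
    decoded.writeStream.outputMode("append").format("console").start().awaitTermination()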

Job Band:
H5

Shift:
1st shift (United States of America)

Hours Per Week:
40

Weekly Schedule:

Referral Bonus Amount:
0

Keywords: Bank of America, Plano, Feature Lead Technology, IT / Software / Systems, Plano, Texas
