
Principal Software Engineer/Developer - 2007701

Company: Fidelity Investments
Location: Durham, North Carolina
Posted on: May 8, 2020

Job Description:

Implements batch and real-time Big Data integration frameworks and applications using Hadoop, Spark, and Kafka. Identifies and ingests new data sources and performs feature engineering for integration into models using object-oriented programming. Uses business knowledge to translate the vision for divisional initiatives into business solutions by developing complex or multiple software applications and conducting studies of alternatives. Analyzes and recommends changes in project development policies, procedures, standards, and strategies to development experts and management.
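
By way of illustration, a minimal sketch (in Scala) of the kind of Spark Structured Streaming job that bridges the real-time and batch sides of such an integration framework; the broker address, topic, and HDFS paths are hypothetical:

    import org.apache.spark.sql.SparkSession

    object StreamingIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("streaming-ingest")
          .getOrCreate()

        // Read a real-time feed from Kafka as a streaming DataFrame.
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
          .option("subscribe", "trade-events")              // hypothetical topic
          .load()
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

        // Land the stream as Parquet so downstream batch jobs can pick it up.
        events.writeStream
          .format("parquet")
          .option("path", "/data/landing/trade-events")          // hypothetical path
          .option("checkpointLocation", "/checkpoints/trade-events")
          .start()
          .awaitTermination()
      }
    }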

Primary Responsibilities:

Participates in architecture design teams.

Defines and implements application-level architecture.

Develops applications on complex projects, components, and subsystems for the division.

Recommends development testing tools and methodologies and reviews and validates test plans.

Responsible for QA readiness of software deliverables.

Develops comprehensive documentation for multiple applications or subsystems.

Establishes full project life cycle plans for complex projects across multiple platforms.

Responsible for meeting project goals on time and on budget.

Advises on risk assessment and risk management strategies for projects.

Plans and coordinates project schedules and assignments for multiple projects.

Acts as a primary liaison for business units to resolve various project/technology issues.

Provides technology solutions for day-to-day issues and technical evaluation estimates for technology initiatives.

Advises senior management on technical strategy.

Mentors junior team members.

Performs independent and complex technical and functional analysis for multiple projects supporting several divisional initiatives.

Develops original and creative technical solutions for ongoing development efforts.

Education and Experience:

Bachelor's degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and five (5) years of experience in the job offered or five (5) years of experience designing and developing big data applications for Cloud platforms (Amazon Web Services, Azure, Google, and IBM) using Agile Methodologies.

Or, alternatively, a Master's degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and three (3) years of experience in the job offered or three (3) years of experience designing and developing big data applications for Cloud platforms (Amazon Web Services, Azure, Google, and IBM) using Agile Methodologies.

Skills and Knowledge:

Candidate must also possess:

Demonstrated Expertise (DE) designing, developing, and data modeling big data applications, using Unix scripting, Python, Apache Spark, Spark SQL, and Spark analytical functions within Hadoop ecosystems; and designing, developing, implementing, and troubleshooting big data Hadoop applications within production and non-production environments to support predictive analytics.
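
For illustration, a minimal sketch (Scala) of the kind of Spark analytical (window) function used for feature engineering of this sort; the table paths and column names are hypothetical:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.expressions.Window
    import org.apache.spark.sql.functions.{avg, col}

    object AccountFeatures {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("account-features").getOrCreate()

        val txns = spark.read.parquet("/data/curated/transactions") // hypothetical path

        // Trailing 30-row average of transaction amount per account,
        // a typical engineered feature for a predictive model.
        val w = Window.partitionBy("account_id").orderBy("txn_ts").rowsBetween(-29, 0)
        val features = txns.withColumn("avg_amount_30", avg(col("amount")).over(w))

        features.write.mode("overwrite").parquet("/data/features/transactions")
      }
    }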

DE designing and developing event-driven, microservice-based applications, using stream-based technologies (Spring Boot, Kafka, Kafka Streams, and Flume) and programming languages (Java and Scala); and creating, managing, modeling, and performance tuning SQL queries and AWS Redshift clusters.
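
For illustration, a minimal Kafka Streams topology of the kind this expertise describes, shown in Scala and without the Spring Boot wiring; the application id and topic names are hypothetical:

    import java.util.Properties

    import org.apache.kafka.common.serialization.Serdes
    import org.apache.kafka.streams.{KafkaStreams, StreamsBuilder, StreamsConfig}

    object FilledTradeFilter {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "filled-trade-filter") // hypothetical id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092")      // hypothetical broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass)
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass)

        val builder = new StreamsBuilder()
        // Consume raw trade events, keep only filled orders, and publish
        // them to a downstream topic for other microservices.
        builder
          .stream[String, String]("trades")               // hypothetical input topic
          .filter((_, value) => value.contains("FILLED"))
          .to("filled-trades")                            // hypothetical output topic

        val streams = new KafkaStreams(builder.build(), props)
        streams.start()
        sys.addShutdownHook(streams.close())
      }
    }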

DE implementing big data computing strategies, including CI/CD processes on enterprise applications using automation tools (Maven and Jenkins); and deploying big data applications to the Cloud according to standard security requirements (IAM and fine-grained access controls, including role/policy-based access and encryption schemes) and architectural requirements (domain/data modeling, optimized file formats, and Big Data architectures such as Lambda Architecture, Data Lakes, and Data Virtualization), using AWS Elastic Beanstalk, Jenkins pipelines, and Docker.

DE designing, developing, and automating data extraction, ingestion, and transformation layers in Hadoop using big data technologies (Flume, Sqoop, Hive, and the Java MapReduce framework) within on-premises environments and cloud-based environments on Amazon Web Services (S3, EC2, and EMR).
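
For illustration, a minimal sketch (Scala, Spark on EMR) of one common shape of such a transformation layer, reading raw extracts from S3 into a Hive table; Spark stands in here for the Flume/Sqoop/MapReduce tooling named above, and the bucket, database, and table names are hypothetical:

    import org.apache.spark.sql.SparkSession

    object S3ToHive {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("s3-to-hive")
          .enableHiveSupport() // use the cluster's Hive metastore
          .getOrCreate()

        // Ingest raw CSV extracts landed in S3 (hypothetical bucket/prefix).
        val raw = spark.read
          .option("header", "true")
          .csv("s3://example-bucket/raw/accounts/")

        // Light transformation, then load into a managed Hive table.
        raw.dropDuplicates("account_id")
          .write
          .mode("overwrite")
          .saveAsTable("analytics.accounts") // hypothetical Hive db.table
      }
    }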

For full job details and to apply, please visit https://jobs.fidelity.com/ and search for job number: 2007701.
