Job Description
Do you love a career where you Experience, Grow & Contribute at the same time, while earning at least 10% above the market? If so, we are excited to have crossed paths with you.
Learn how we are redefining the meaning of work, and be a part of the team raved about by Clients, Job-seekers, and Employees.
If you are a Data Engineer looking for excitement, challenge, and stability in your work, you will be glad you came across this page.
We are an IT Solutions Integrator/Consulting Firm helping our clients hire the right professional for an exciting long-term project. Here are a few details.
Check if you are up for maximizing your earning and growth potential by leveraging our Disruptive Talent Solution.
Role: Data Engineer
Location: New York, NY (Onsite)
Experience: 8+ years
Requirements
We are seeking a highly skilled Data Engineer to join our data engineering team at Prudential PGIM. The successful candidate will play a critical role in building, optimizing, and maintaining scalable and reliable data pipelines to support enterprise-level data analytics and business intelligence initiatives.
Key Responsibilities:
- Develop, test, and maintain robust data pipelines using Python and PySpark.
- Leverage Microsoft Fabric tools and services to manage data integration, ingestion, and transformation processes.
- Ensure high-quality data delivery by implementing rigorous data validation and cleansing routines.
- Automate workflows to streamline data processing and reduce manual intervention.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver scalable solutions.
- Monitor, troubleshoot, and optimize pipeline performance and reliability.
- Document processes, data flows, and best practices to ensure maintainability and compliance.
- Participate in Agile/Scrum ceremonies and contribute to sprint planning and retrospectives.
Required Skills & Qualifications:
- MS Fabric: Hands-on experience with Microsoft Fabric, including Dataflows, Pipelines, and Lakehouses.
- SQL: Advanced SQL skills for querying, transforming, and analyzing large datasets.
- Python: Strong Python programming skills, especially in data manipulation and scripting.
- PySpark Notebooks: Experience with distributed data processing using PySpark in notebook environments.
- ETL Development: Experience designing and implementing Extract, Transform, Load (ETL) solutions.
- Data Quality: Familiarity with data profiling, validation, and quality assurance techniques.
- Automation: Proficiency in automation frameworks for data processes and workflows.
Preferred Qualifications:
- Experience with Azure Data Services, Databricks, or other cloud platforms.
- Familiarity with Git, CI/CD pipelines, and DevOps practices in data engineering.
- Background in financial services or investment management is a strong plus.
- Knowledge of data governance and compliance standards.
Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent communication and interpersonal skills.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
- Collaborative team player with a proactive attitude.
Benefits
Visit us at http://alignity.io/careers. Alignity Solutions is an Equal Opportunity Employer, M/F/V/D.