Jobs at Parexel
Are you ready to make your mark in the healthcare industry using your ingenuity and technical expertise? At Parexel, there's no limit to what you can accomplish. We work with the top 50 global pharmaceutical companies, the top global biotech companies, and hundreds of small and emerging companies. We've supported the development of some of the most important treatments of our time, including nearly all of the 50 top-selling drugs currently on the market. Join us and you'll make a profound difference in millions of lives.

With a role in our Information Technology group, you'll work with cutting-edge technology on internal and client-facing projects. Collaborating with people from all over the world and touching everything from enterprise systems and clinical applications to infrastructure and networking, you'll keep our data safe and confidential. You'll have the chance to develop your skills and pursue advancement opportunities that include managerial and technical tracks.

If you have stellar coding and design experience and a passion to learn more about healthcare, Parexel invites you to discover our rewarding opportunities in Enterprise Architecture, Software Development, Project Management and Business Analysis, and more. In a global clinical trial environment, business-focused applications, connectivity, and data security are critical to a trial's outcome. Find out how you can be an invaluable part of our success story.
Senior Data Engineer
Job ID R0000036318, United Kingdom

Parexel is seeking a highly experienced Senior Data Engineer to architect, develop, and optimize enterprise-grade data pipelines and platforms using Azure, Databricks, Snowflake, and Power BI. This role is pivotal in transforming raw data into actionable insights and building a resilient, scalable data ecosystem that supports business-critical functions across clinical and operational domains.
Key Responsibilities:
Architect and implement end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake for large-scale data ingestion, transformation, and storage.
Design, build, modify, and support data pipelines in a medallion architecture setting, using Microsoft Azure data PaaS services, Databricks, and Power BI
If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders
Demonstrate an excellent grasp of, and expertise with, test-driven development and continuous integration processes
Analysis and Design – Convert high-level designs into low-level designs and implement them
Collaborate with Team Leads to define and clarify business requirements, estimate development costs, and finalize work plans
Testing – Create and run unit and integration tests on all delivered code throughout the development lifecycle
Proactively benchmark application code to prevent performance and scalability problems
Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management
Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments
Familiarity with Power BI and Reltio is advantageous but not required
Collaborate with BI teams to ensure data models are optimized for reporting in Power BI, with a focus on performance and usability.
Establish data governance, quality, and security controls, ensuring compliance with GDPR, HIPAA, and global clinical data regulations.
Mentor and guide junior engineers, fostering technical excellence and knowledge sharing.
Drive automation and CI/CD practices within data engineering pipelines, integrating with version control and deployment workflows.
Work closely with Data Architects, Business Analysts, and Product Owners to translate business needs into technical solutions.
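To give candidates a feel for the medallion (bronze/silver/gold) pattern referenced in the responsibilities above, here is a minimal, library-free sketch. A real implementation would use Databricks/Spark tables; the record fields here (site_id, enrolled) are hypothetical examples, not Parexel schemas:

```python
# Minimal sketch of a medallion-style pipeline: bronze (raw) ->
# silver (cleaned/validated) -> gold (aggregated, reporting-ready).
# Field names are hypothetical; a real pipeline would use Spark DataFrames.

def to_silver(bronze_rows):
    """Clean and validate raw (bronze) records into silver records."""
    silver = []
    for row in bronze_rows:
        if row.get("site_id") is None:  # drop malformed rows at the silver layer
            continue
        silver.append({
            "site_id": str(row["site_id"]).strip().upper(),  # normalize keys
            "enrolled": int(row.get("enrolled", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate silver records into a gold table keyed by site."""
    totals = {}
    for row in silver_rows:
        totals[row["site_id"]] = totals.get(row["site_id"], 0) + row["enrolled"]
    return totals

bronze = [
    {"site_id": " uk01 ", "enrolled": 12},
    {"site_id": None, "enrolled": 3},   # malformed, filtered out in silver
    {"site_id": "UK01", "enrolled": 5},
]
print(to_gold(to_silver(bronze)))  # {'UK01': 17}
```

Each layer is a pure function of the previous one, which mirrors how medallion pipelines stage data so that each layer can be validated and rebuilt independently.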
Required Qualifications:
Experience: 6+ years of data engineering experience, with at least 4 years hands-on in Azure, Databricks, and Snowflake; experience with Reltio and Power BI integration is highly desirable.
Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
Skills:
Expert-level knowledge of Azure Data Factory, Databricks, and Snowflake.
Understanding of quality processes and estimation methods
Understanding of design concepts and architectural fundamentals
Fundamental grasp of the project domain
Ability to translate functional and non-functional needs into system requirements
Ability to design and code complex applications
Ability to create test cases and scenarios from specifications
Solid knowledge of the SDLC and agile methodologies
Knowledge of current technologies and trends
Logical thinking and problem-solving abilities, and the capacity to collaborate effectively
Primary skills: Cloud Platform, Azure, Databricks, ADF, ADO
Advantageous: SQL, Python, Power BI
General Knowledge: PowerApps, Java/Spark, Reltio
3–5 years of experience in software development, with a minimum of 2 years in cloud computing
Proven experience in building BI-ready datasets and performance tuning in Power BI
Proficient in SQL, Python, and cloud-native architecture.
Strong grasp of data security, privacy compliance, and best practices in a regulated environment.
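The test-driven development expectation above can be illustrated with a small, self-contained example in the test-first style: the tests below are written against a hypothetical data-cleaning helper (normalize_country is an invented illustration, not a Parexel function):

```python
import unittest

def normalize_country(value):
    """Normalize free-text country values to ISO-style codes.
    Hypothetical helper, used only to illustrate test-first development."""
    mapping = {"united kingdom": "GB", "uk": "GB", "great britain": "GB"}
    key = value.strip().lower()
    return mapping.get(key, key.upper())

class TestNormalizeCountry(unittest.TestCase):
    # In TDD these cases are written first and drive the implementation above.
    def test_known_alias(self):
        self.assertEqual(normalize_country(" United Kingdom "), "GB")

    def test_passthrough(self):
        self.assertEqual(normalize_country("de"), "DE")

if __name__ == "__main__":
    unittest.main(exit=False)  # run the tests without terminating the process
```

Unit tests like these run on every commit in a CI pipeline (Azure DevOps, listed above, is one common host), so regressions in transformation logic surface before deployment.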
#LI-REMOTE