The Core Engineering team builds the platform technology that connects GLG’s clients to our experts. Our team of developers, IT engineers, operations engineers, data engineers, and product managers is responsible for GLG’s technology initiatives, product development, and IT operations.
We are seeking a Senior-Principal Data Engineer to join our Core Engineering team and build the data pipeline for the entire organization. GLG is building a next-generation platform to transform the way we deal with data across the various facets of our company. In this role, you will be responsible for helping to source and organize data and make it accessible to numerous engineering teams, business stakeholders, and external partners.
Specific responsibilities include (but are not limited to):
- Cleanse, normalize, and enhance the quality of data from both existing operational systems and new data sources that flow through the data platform.
- Explore and introduce data that enriches our existing data sources, and perform analysis to understand and articulate the value of the augmentation and enhancement.
- Help architect and select the right tools and technologies for data movement and organization that are secure, performant, and well structured.
- Coach and educate engineers across the entire organization on how to take advantage of a central data repository.
- Work with the latest approaches to creating physical and virtual data layers, for example Apache Spark, Azure Data Lake, and S3.
- Lead with the cloud: focus on engineering and leverage managed services rather than installing and maintaining infrastructure.
- Partner with business users to understand their business processes and their requirements for data storage, access, and impact.
- Focus on predictive analytics, next generation tools and approaches that find insight in our data.
An ideal candidate will have the following:
- At least 10 years of experience as a data engineer, data architect, data modeler, or similar, designing and implementing data systems including data warehouses, operational data stores, and long-term storage mechanisms.
- A desire to participate in all aspects of the development lifecycle from inception to implementation and support.
- Must have expertise in SQL and Python.
- Experience using data engineering and data science libraries such as PyTorch, Matplotlib, NumPy, SciPy, pandas, Seaborn, and TensorFlow is required.
- Experience with the following:
- Apache Spark
- Data science
- Solid communication skills and the ability to present and visualize business processes, data flow and systems architecture.
- Must be able to work with ambiguous requirements and drive impactful solutions.
- Data governance experience is preferred.
About GLG / Gerson Lehrman Group
GLG is the world’s knowledge marketplace. Our clients rely on GLG’s global team to connect with powerful insight across fields from our network of 700,000+ experts (and the hundreds of new experts we recruit every day).
We serve thousands of the world’s best businesses, from Fortune 500 corporations to leading technology companies to professional services firms and financial institutions. We connect our clients to the world’s largest and most varied source of first-hand expertise, including executives, scientists, academics, former public-sector leaders, and the foremost subject matter specialists.
GLG’s industry-leading compliance framework allows clients to learn in a structured, auditable, and transparent way, consistent with their own internal compliance obligations and the highest professional ethical standards. Our compliance standards are a major competitive differentiator and key component of the company’s culture.
To learn more, visit www.GLG.it.