Business Development

Data Engineer, Integrations

Experience Level: Mid


Do you enjoy diving deep into business processes and providing guidance on improving them through technology? This role focuses on integrating the many systems used by different internal groups and on optimizing processes and the flow of data between them.


As a Data Engineer, you will:

  • Help define, document, and better understand the workflows involved in business processes
  • Translate those workflows into opportunities for improvement through technology, from better data ingestion to complex automated workflows
  • Dive into and deeply understand new data sources and their underlying data libraries to transform, integrate, and make them accessible for self-directed analysis by stakeholders across the business
  • Help architect and select the right tools and technologies to provide data movement and storage that is well organized, secure, and performant
  • Lead with the cloud, focusing on engineering and leveraging managed services rather than installing and maintaining infrastructure
  • Explore and introduce data that enriches our existing data sources, and perform analysis to understand and articulate the value of that enrichment


You will have:

  • 5+ years of experience building production-scale integrations, with hands-on experience using iPaaS tools such as Dell Boomi, MuleSoft, Tibco, Jitterbit, OIC, or Workato (Workato preferred)
  • Experience consuming application APIs using REST, SOAP, XML, JSON, SQL, and messaging frameworks
  • Experience with Python or other object-oriented languages (preferred)
  • Strong communication skills and the ability to present and visualize business processes, data flows, and systems architecture
  • A desire to participate in all aspects of the development lifecycle from inception to implementation and support
  • The ability to work with vague requirements to drive impactful solutions
  • Experience designing and writing enterprise-wide workflows from scratch in Python, using tools such as Airflow and Docker
  • Experience designing and implementing data systems, including data warehouses, operational data stores, and long-term storage mechanisms, is a plus
  • Experience with SQL development and data modeling is a plus
  • Snowflake experience is a plus
  • Knowledge of data engineering best practices across the development lifecycle, including coding standards, code reviews, source management, build processes, testing, and operations


About GLG / Gerson Lehrman Group

GLG is the world’s insight network. Our clients rely on GLG’s global team to connect with powerful insight across fields from our network of 900,000+ experts (and the hundreds of new experts we recruit every day).

We serve thousands of the world’s best businesses, from Fortune 500 corporations to leading technology companies to professional services firms and financial institutions. We connect our clients to the world’s largest and most varied source of first-hand expertise, including executives, scientists, academics, former public-sector leaders, and the foremost subject matter specialists.

GLG’s industry-leading compliance framework allows clients to learn in a structured, auditable, and transparent way, consistent with their own internal compliance obligations and the highest professional ethical standards. Our compliance standards are a major competitive differentiator and key component of the company’s culture.

To learn more, visit www.GLGinsights.com.