A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe.
You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
Curiosity and a constant quest for knowledge serve as the foundation for success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
- Support the development of data pipelines: Assist in building, maintaining, and optimizing ETL/ELT workflows to ingest, process, and transform data from various sources.
- Work with cloud data platforms: Help configure, test, and monitor cloud-based data environments (e.g., Snowflake, GCP, AWS, Azure).
- Data modeling & warehousing: Contribute to the design and implementation of scalable data models, tables, and schemas used for analytics and reporting.
- Documentation: Maintain clear documentation of data processes, pipeline logic, and system configurations.
- Collaboration: Work closely with senior data engineers, analysts, and business teams to understand requirements and translate them into technical tasks.
- Learning & development: Continuously learn new tools, technologies, and best practices related to data engineering and cloud platforms.
- Automation & optimization: Assist in identifying opportunities to automate repetitive tasks and improve the performance of existing data workflows.
- Basic understanding of data engineering concepts and cloud data platforms.
- Interest in learning and working with modern data tools such as Snowflake or similar cloud-based technologies.
- Familiarity with fundamental data concepts such as ETL/ELT, data modeling, and data quality.
- Motivation to develop skills in cloud ecosystems and modern data frameworks.
- Practical knowledge of SQL and experience writing queries for data extraction and manipulation.
- Understanding of data warehousing concepts, star/snowflake schemas, and data pipeline orchestration.
- Experience or familiarity with cloud platforms such as GCP, AWS, Azure, Fabric, or Snowflake.
- Exposure to modern data engineering tools and services (e.g., BigQuery, Databricks, Dataflow, ...).