Showing 1 - 2 of 2 job openings
matching the keyword Hive
Bangkok, IT / Programming, Consulting
Experience:
6+ years
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
Negotiable
- Collate technical and functional requirements through workshops with senior stakeholders in risk, actuarial, pricing and product teams.
- Translate business requirements to technical solutions leveraging strong business acumen.
- Analyse current business practice, processes, and procedures as well as identifying future business opportunities for leveraging Data & Analytics solutions on various platforms.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Provide architectural expertise to sales, project and other analytics teams.
- Identify risks, assumptions, and develop pricing estimates for the Data & Analytics solutions.
- Provide solution oversight to delivery architects and teams.
- Skills and attributes for success:
- 6-8 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to estimate complexity, effort and cost.
- Ability to produce client-ready solution architectures and business-understandable presentations, with good communication skills to lead and run workshops.
- Strong knowledge of data manipulation languages and tools such as Spark, Scala, Impala, Hive SQL, Apache NiFi and Kafka necessary to build and maintain complex queries, streaming and real-time data pipelines.
- Data modelling and architecting skills including strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP, or data vault.
- Good fundamentals around security integration including Kerberos authentication, SAML and data security and privacy such as data masking and tokenisation techniques.
- Good knowledge in DevOps engineering using Continuous Integration/ Delivery tools.
- An in-depth understanding of Cloud solutions (AWS, Azure and/or GCP) and experience integrating them into traditional hosting/delivery models.
- Ideally, you'll also have:
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
- What working at EY offers:
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY:
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
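The posting above asks for fundamentals in data security and privacy, including data masking and tokenisation. As a minimal, self-contained sketch of those two techniques (not tied to any EY tooling; the key and field names here are illustrative assumptions), masking hides part of a value while tokenisation replaces it with a deterministic surrogate that still supports joins:

```python
import hmac
import hashlib

# Hypothetical secret key; in production this would come from a key vault.
SECRET_KEY = b"example-key"

def mask_email(email: str) -> str:
    """Mask the local part of an email, keeping only the first character."""
    local, _, domain = email.partition("@")
    return local[0] + "*" * (len(local) - 1) + "@" + domain

def tokenize(value: str) -> str:
    """Deterministic tokenisation via HMAC-SHA256: the same input always
    yields the same token (so joins across tables still work), but the
    raw value cannot be recovered from the token."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"email": "jane.doe@example.com", "national_id": "1234567890123"}
safe = {
    "email": mask_email(record["email"]),
    "national_id": tokenize(record["national_id"]),
}
```

Because the token is deterministic, two datasets tokenised with the same key can still be joined on the tokenised column without exposing the underlying identifier.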
Posted 3 days ago
Khlong Toei, Bangkok
Skills:
Python, SQL, Java
Job type:
Full-time
Salary:
Negotiable
- Data Pipeline Development: Design, implement, and maintain data analytics pipelines and processing systems.
- Data Modeling: Apply data modeling techniques and integration patterns to ensure data consistency and reliability.
- Data Transformation: Write data transformation jobs through code to optimize data processing.
- Data Management: Perform data management through data quality tests, monitoring, cataloging, and governance.
- LLM Integration: Design and integrate LLMs into existing applications, ensuring smooth functionality and performance.
- Model Development and Fine-Tuning: Develop and fine-tune LLMs to meet specific business needs, optimizing for accuracy and efficiency.
- Performance Optimization: Continuously optimize LLM performance for speed, scalability, and reliability.
- Infrastructure Knowledge: Possess knowledge of the data and AI infrastructure ecosystem.
- Collaboration: Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Continuous Learning: Demonstrate a willingness to learn and find solutions to complex problems.
- Education: Bachelor's or Master's degree in Computer Science, AI, Engineering, or a related field.
- Experience: At least 3 years of experience in data engineering.
- Technical Skills: Proficiency in Python, SQL, Java; experience with LLM frameworks (e.g., LangChain); and familiarity with cloud computing platforms. Additionally, visualization tools, e.g., Power BI, Tableau, Looker, Qlik.
- Cloud Computing: Familiarity with cloud computing platforms, such as GCP, AWS, or Databricks.
- Problem-Solving: Strong problem-solving skills with the ability to work independently and collaboratively.
- Desirable:
- System Design: Knowledge of system design and platform thinking to build sustainable solutions.
- Big Data Experience: Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, duckDB, Impala, Hive).
- Data Warehouse Solutions: Experience working with data warehouse solutions, ELT tools, and techniques (e.g., Airflow, dbt, SAS, Nifi).
- API Development: Experience with API design to facilitate integration of LLMs with other systems.
- Prompt Engineering: Skills in designing sequential tasks for LLMs to achieve efficient and accurate outputs.
- Visualization Solution: Skills in designing and developing dashboards for analytics and insights.
- Agile Methodologies: Experience with agile software delivery and CI/CD processes.
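The prompt-engineering bullet above describes designing sequential tasks for LLMs. A minimal sketch of that pattern (a "prompt chain") is shown below; the `llm()` function is a hypothetical stand-in stub, since a real pipeline would call an actual model API, e.g. through LangChain as mentioned in the requirements:

```python
# Sequential prompt chain: each step's output is fed into the next prompt.
# Step 1 extracts structured facts; step 2 rewrites them for an audience.

def llm(prompt: str) -> str:
    """Hypothetical stand-in for a model call; returns canned answers so
    the sketch is runnable without network access or credentials."""
    if prompt.startswith("Extract"):
        return "revenue up 12%, churn down 3%"
    return "Headline: revenue up 12%, churn down 3%"

def run_chain(document: str) -> str:
    # Task 1: pull the key metrics out of the raw document.
    facts = llm(f"Extract the key metrics from: {document}")
    # Task 2: turn the extracted facts into a one-line headline.
    return llm(f"Write a one-line headline from these facts: {facts}")

result = run_chain("Q3 report: revenue grew 12% while churn fell 3%.")
```

Splitting the work into narrow, single-purpose prompts like this tends to produce more reliable outputs than one large prompt, and each intermediate result can be validated before the next step runs.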
Posted 2 days ago
