Experience:
3 years required
Skills:
Quality Assurance, Assurance, ETL, English
Job type:
Full-time
Salary:
negotiable
- Act as a strategic partner, providing analytics technology expertise and input to business functions, resulting in an optimal tech investment portfolio and effective digital solution recommendations from DG's expertise to meet business requirements.
- Take the lead among stakeholders throughout the organization to understand data needs and identify issues or opportunities for leveraging company data, proposing solutions that support decision-making and drive business outcomes.
- Take the lead in communicating with related parties (in analytics-related areas), i.e., ...
- Take the lead in the design and layout of information visualizations for dashboards, presentations, and infographics that present complex datasets in a simple and intuitive format, in adherence to data visualization best practices.
- Develop, test, debug, and integrate data visualization solutions, and ensure visualization (including interaction) consistency on analytics projects.
- Take the lead in the technical quality assurance process to ensure that dashboard design and data modeling support all business requirements.
- Take the lead on the surrounding digital environment (i.e., other applications), and be responsible for ensuring that newly implemented BI/analytics solutions work seamlessly through proper interfaces to optimize work effectiveness. Over the course of implementation, ensure the necessary data are integrated, transformed, and loaded, and that the system is brought into production successfully.
- Take the lead in delivering and cultivating business value by driving a data-driven company through maximum adoption of analytics, aligned with the digital transformation master plan.
- Control, manage, and govern Level 2 support: identify and fix configuration-related problems regarding BI and visualization.
- Act as project manager for data projects, managing project scope, timeline, and budget.
- Manage relationships with stakeholders and coordinate work between different parties, as well as providing regular updates.
- EXPERIENCE.
- At least 3-4 years' experience working in the BI and analytics area, designing and building BI dashboards.
- At least 3-4 years' experience in fully implemented IT projects (design, development, and support).
- Experience in the oil and gas business would be a strong asset.
- Knowledge of data analytics and computing tools, e.g., ETL/ELT tools and/or data visualization tools, including cloud platform solutions.
- Knowledge of enterprise software and technology such as SAP ECC, SAP BW, Power BI, AWS, etc.
- Able to construct complex data models to help visualize and interpret data.
- A continuous learner who relishes the challenge of new BI technologies.
- EDUCATION.
- Bachelor's degree in Computer Science, Computer Engineering, Information Technology, or a related discipline.
- A certificate related to the data analytics area would be a strong asset.
- OTHER REQUIREMENTS.
- A self-starter attitude and eagerness to further develop new skills.
- Strong written and verbal English skills.
Skills:
ETL, Python
Job type:
Full-time
Salary:
negotiable
- Consolidate raw data from various internal and external sources, and build algorithms and prototypes.
- Collect structured and unstructured data, and assess business needs and objectives.
- Organize and clean raw data (data cleansing + shaping + ETL), including preparing data for modeling.
- Explore data with data visualization tools.
- Find ways to improve data quality and establish data reliability; watch for trends and patterns.
- Use AI/ML for prediction (predictive analytics) and report the results (a minimal sketch follows this list).
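To make the cleanse → explore → predict workflow above concrete, here is a minimal, illustrative Python sketch using Pandas and Scikit-learn (both named in the posting). The file sales.csv and its columns are hypothetical, and LinearRegression stands in for whatever model a real project would choose.

```python
# A minimal sketch of the cleanse -> explore -> predict flow described above.
# Assumes a hypothetical sales.csv with columns: region, units, price, revenue.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("sales.csv")

# Data cleansing: drop duplicates, fill missing numeric values with the median.
df = df.drop_duplicates()
df["units"] = df["units"].fillna(df["units"].median())
df["price"] = df["price"].fillna(df["price"].median())

# Quick exploration: summary statistics and a trend check by region.
print(df.describe())
print(df.groupby("region")["revenue"].mean())

# Simple predictive model (regression), one of the techniques the posting lists.
X, y = df[["units", "price"]], df["revenue"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```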
- Bachelor's degree in Computer Science, Engineering, Mathematics, Statistics, or a related field.
- Experience with data visualization.
- Experience with cloud platforms.
- Experience with Python, e.g., Pandas, NumPy, Scikit-learn, Matplotlib, and PyTorch.
- Experience with machine learning, e.g., regression, classification, clustering, and association.
- A solid understanding of statistics.
- Presentation skills, with the ability to distill complex ideas into an easily understood format.
- A good communicator who can manage stakeholders.
- Problem-solving skills.
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, scalability, etc. (a sketch follows this list).
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development and programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Define data retention policies, monitor performance, and advise on any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data-related inquiries to support business and technical teams.
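As a rough illustration of the automation and resilience goals named at the top of this list, the sketch below retries a failing source with exponential backoff. The source names and the fetch/load stubs are hypothetical placeholders, not the actual IICS or Databricks tooling.

```python
# Illustrative sketch only: a resilient multi-source ingestion loop with retry
# and simple logging. Source names and fetch/load functions are hypothetical.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

SOURCES = ["crm_api", "policy_db", "claims_feed"]  # hypothetical sources

def fetch(source: str) -> list[dict]:
    # Placeholder for a real extractor (e.g., an IICS task or a REST pull).
    return [{"source": source, "value": 1}]

def load(records: list[dict]) -> None:
    # Placeholder for a warehouse load (e.g., a Snowflake COPY or bulk insert).
    log.info("loaded %d records", len(records))

def ingest(source: str, retries: int = 3, backoff: float = 2.0) -> None:
    for attempt in range(1, retries + 1):
        try:
            load(fetch(source))
            return
        except Exception:
            log.exception("attempt %d failed for %s", attempt, source)
            time.sleep(backoff ** attempt)  # exponential backoff for resilience
    raise RuntimeError(f"ingestion failed for {source} after {retries} attempts")

for s in SOURCES:
    ingest(s)
```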
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration; hands-on development skills in Informatica IICS, Databricks/Spark, and Python are a must.
- Clear hands-on experience with database systems - SQL Server, Oracle, Azure Synapse, Snowflake, and Cosmos - cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In-depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge & working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
Skills:
ETL, Data Analysis, Industry trends
Job type:
Full-time
Salary:
฿70,000 - ฿90,000, negotiable
- Analyze and interpret complex data sets to uncover insights and trends that drive business strategy and decision-making.
- Collaborate with cross-functional teams to understand their data needs and provide actionable recommendations.
- Design and maintain dashboards, reports, and visualizations using tools to communicate insights effectively.
- Extract data from various sources, including databases, APIs, and third-party services, ensuring data quality and accuracy.
- Develop and implement data models, ETL processes, and automated reporting solutions to streamline data analysis (see the sketch after this list).
- Stay updated with industry trends and new technologies to enhance the company's data analytics capabilities.
- Participate in data governance initiatives, ensuring compliance with data privacy and security regulations.
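As an illustration of the kind of automated reporting query this role involves, here is a self-contained sketch using sqlite3 from the standard library as a stand-in for the production warehouse; the orders table and its columns are hypothetical.

```python
# Sketch of a dashboard-ready reporting aggregate. sqlite3 stands in for the
# real warehouse; table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_date TEXT, channel TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('2024-01-01', 'web', 120.0),
        ('2024-01-01', 'app', 80.0),
        ('2024-01-02', 'web', 200.0);
""")

# Daily revenue by channel: the shape a BI tool would chart directly.
rows = conn.execute("""
    SELECT order_date, channel, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date, channel
    ORDER BY order_date, channel
""").fetchall()

for r in rows:
    print(r)
```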
- Requirements/Qualifications (must have):
- Bachelor's degree in Statistics, Data Science, or a related field; an MBA or advanced degree is a plus.
- Minimum of 5 years of experience in business intelligence or data analysis, preferably in a fast-paced e-commerce environment.
- Proficient in SQL and at least one data visualization tool (e.g., Tableau, Power BI), with a solid understanding of data warehousing concepts.
- Strong analytical skills, with the ability to manipulate, clean, and derive insights from large datasets.
- Effective communicator with excellent presentation skills, capable of translating complex data into simple, actionable insights for non-technical stakeholders.
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
negotiable
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium-to-complex data sets that meet functional/non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies (see the sketch after this list).
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure the technology footprint adheres to data security policies and procedures related to encryption, obfuscation, and role-based access.
- Create data tools for analytics and data scientist team members.
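A minimal PySpark sketch of the extract-transform-load step referenced above; the S3 paths, the events schema, and the partitioning choice are all assumptions for illustration, not a prescribed design.

```python
# Sketch of a batch extract-transform-load step with PySpark, assuming a
# hypothetical events dataset; paths and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-etl").getOrCreate()

# Extract: read raw JSON events from a landing zone.
raw = spark.read.json("s3://landing/events/")  # hypothetical path

# Transform: keep valid rows and derive a daily partition column.
clean = (
    raw.filter(F.col("user_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream analytics.
clean.write.mode("overwrite").partitionBy("event_date").parquet("s3://lake/events/")
```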
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance, and advising on any necessary infrastructure changes based on functional and non-functional requirements.
- In-depth knowledge of the data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems - the Hadoop ecosystem, cloud technologies (e.g., AWS, Azure, Google), in-memory database systems (e.g., HANA, Hazelcast, etc.), and other database systems - traditional RDBMS (e.g., Teradata, SQL Server, Oracle) and NoSQL databases (e.g., Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools - traditional ETL tools (e.g., Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
- Educational.
- Background in programming, databases and/or big data technologies OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
Skills:
ETL, SQL, Hadoop
Job type:
Full-time
Salary:
negotiable
- Conduct meetings with users to understand their data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near-real-time streaming (see the streaming sketch after this list).
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
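For the near-real-time streaming item in this list, a hedged Spark Structured Streaming sketch follows; the Kafka broker, topic, and S3 paths are hypothetical, and a real job would parse the payload against a schema rather than landing raw strings.

```python
# Near-real-time streaming sketch with Spark Structured Streaming. Broker,
# topic, and paths are illustrative; reading from Kafka also requires the
# spark-sql-kafka package at runtime.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-etl").getOrCreate()

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "sales-events")
         .load()
)

# Kafka delivers bytes; cast the payload to string before parsing downstream.
events = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    events.writeStream.format("parquet")
          .option("path", "s3://lake/stream/sales/")
          .option("checkpointLocation", "s3://lake/checkpoints/sales/")
          .trigger(processingTime="1 minute")
          .start()
)
query.awaitTermination()
```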
- Master's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in data management or data engineering (retail or e-commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Experience in Generative AI is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training (see the sketch after this list).
- Conduct meetings with users to understand their data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
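Since this posting adds LLM training-data preparation, here is a tiny illustrative sketch of one such transform: whitespace normalization plus exact deduplication by content hash. The sample documents are made up, and real pipelines layer on near-deduplication, language filtering, and quality scoring.

```python
# Minimal sketch of one cleaning step an LLM-training pipeline needs:
# normalize whitespace, then drop exact duplicates. Input docs are hypothetical.
import hashlib

docs = [
    "Hello   world.",
    "Hello world.",
    "Another document.",
]

def normalize(text: str) -> str:
    return " ".join(text.split())  # collapse runs of whitespace

seen: set[str] = set()
cleaned: list[str] = []
for doc in docs:
    norm = normalize(doc)
    digest = hashlib.sha256(norm.encode("utf-8")).hexdigest()
    if digest not in seen:  # exact dedup on the normalized text
        seen.add(digest)
        cleaned.append(norm)

print(cleaned)  # ['Hello world.', 'Another document.']
```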
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in data management or data engineering (retail or e-commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
ETL, Python, Java
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Implement and optimize data storage solutions, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access.
- Ensure data quality, consistency, and reliability across all data systems (see the sketch after this list).
- Develop and maintain data models and schemas.
- Implement data security and access control measures.
- Optimize query performance and data retrieval processes.
- Evaluate and integrate new data technologies and tools.
- Mentor junior data engineers and provide technical leadership.
- Collaborate with cross-functional teams to support data-driven decision-making.
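To illustrate the data-quality responsibility above, a small sketch of pre-load validation checks with pandas; the DataFrame and the specific rules (unique IDs, no missing values, non-negative amounts) are assumptions for the example.

```python
# Sketch of simple data-quality checks (duplicates, nulls, value ranges) that
# a pipeline might run before loading; the DataFrame below is illustrative.
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "amount": [10.0, -5.0, 20.0, None],
})

issues = []
if df["order_id"].duplicated().any():
    issues.append("duplicate order_id values")
if df["amount"].isna().any():
    issues.append("missing amounts")
if (df["amount"].dropna() < 0).any():
    issues.append("negative amounts")

print("data quality issues:", issues or "none")
```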
- Requirements.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with big data technologies such as Hadoop, Spark, and Hive.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, data warehousing, and ETL best practices.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Skills:
ETL, Java, Python
Job type:
Full-time
Salary:
negotiable
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Create data products for analytics and data scientist team members to improve their productivity.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering to improve our productivity as a team.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Requirements.
- Previous experience as a data engineer or in a similar role.
- Technical expertise with data models, data mining, and segmentation techniques.
- Knowledge of programming languages (e.g. Java and Python).
- Hands-on experience with SQL database design using Hadoop or BigQuery and experience with a variety of relational, NoSQL, and cloud database technologies.
- Strong numerical and analytical skills.
- Experience with BI tools such as Tableau and Power BI.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
Business Development, Creative Thinking, Project Management, English
Job type:
Full-time
Salary:
negotiable
- Strong analytical and problem-solving skills to identify commercial opportunity.
- Working well in a cross-disciplinary team with different types of stakeholders (IT, Agency, Business, Management).
- Business Development of Data Intelligence for corporate strategy.
- Analyze internal and external data from various angles to identify threats and opportunities, and provide information/reports for management or the related business unit teams to plan activities and strategies.
- Participate in developing initiative plans for business units/brands, aligned with corporate strategy, objectives, and KPIs.
- Coordinate with the related departments to implement projects, track project progress, and provide corrective supervision if necessary.
- Create and deliver insights report on new ideas to the management team or business units and seek appropriate decisions, directions, and approvals.
- Bachelor's or Master's degree in business or a related field of study.
- Minimum 5-8 years in a performance management function or commercial analyst roles.
- Experience in corporate/channel/brand/segment strategy.
- Experience working in Data Analytic related projects.
- Excellent analytical and problem-solving skills.
- Ability to apply logical and creative thinking to solve complex business problems.
- Ability to define the problems and frame answers in a logical and structured manner.
- Good project management, team leadership and sense of ownership.
- Good coordination skills, with a positive attitude and the ability to work under pressure.
- Strong communications, customer relationship and negotiation skills.
- Good command of both written and spoken English.
- TECHNICAL SKILLS: basic understanding of the data ecosystem; advanced skills in dashboards and BI tools.
- Conceptual knowledge of data and analytics, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
Automation, SQL, Data Warehousing
Job type:
Full-time
Salary:
negotiable
- Act as the first point of contact for users facing issues related to data and reporting.
- Manage, track, and resolve incidents, service requests, and inquiries via the ticketing system.
- Classify and prioritize incoming tickets based on severity, impact, and urgency.
- Respond to and resolve user tickets in a timely and efficient manner.
- Escalate unresolved or complex issues to appropriate internal teams while maintaining clear communication with the users.
- Diagnose, troubleshoot, and resolve data-related issues, including reporting errors, data discrepancies, and system malfunctions (see the reconciliation sketch after this list).
- Collaborate with other teams (data engineers, data scientists, data analysts, and other IT teams) to address complex issues.
- Provide clear and comprehensive updates to users on incident status and resolution timelines.
- Provide technical support to end-users via phone, email, chat, and ticketing system.
- Process user requests for new reports, data extracts, or updates to existing data views.
- Coordinate with relevant stakeholders to ensure requests are completed accurately and efficiently.
- Respond to user inquiries about reporting tools, data access, and system functionalities.
- Provide guidance and training to users on self-service reporting tools and best practices.
- Maintain an updated knowledge base for frequently asked questions and user guidance.
- Contribute to the development and maintenance of knowledge base articles.
- Analyze recurring issues and recommend changes to improve system stability and user experience.
- Collaborate with development and data teams to identify opportunities for automation and improved processes.
- Collaborate with other teams to improve system performance and user experience.
- Provide on-call support during evenings, weekends, or holidays as required.
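For the data-discrepancy work named earlier in this list, here is a hedged sketch of a reconciliation check: compare a source extract against the reporting layer and surface the rows that disagree. The two frames and their columns are illustrative only.

```python
# Sketch of a reconciliation check for reported data discrepancies: compare a
# source extract against the reporting layer. Frames are illustrative.
import pandas as pd

source = pd.DataFrame({"id": [1, 2, 3], "total": [100, 200, 300]})
report = pd.DataFrame({"id": [1, 2, 3], "total": [100, 250, 300]})

merged = source.merge(report, on="id", suffixes=("_src", "_rpt"))
mismatch = merged[merged["total_src"] != merged["total_rpt"]]
print(mismatch)  # rows where the report disagrees with the source
```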
- Requirements.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in SQL and database querying for troubleshooting and resolving data-related issues.
- Strong understanding of database management and concepts.
- Knowledge of data warehousing concepts and ETL processes.
- Experience with business intelligence and data visualization tools (e.g., Power BI, Oracle OBIEE, Oracle BIP).
- Familiarity with data visualization and reporting systems.
- Experience with cloud platforms (AWS, Azure, GCP, or Oracle Cloud).
- Strong analytical and problem-solving skills.
- Excellent communication skills, both written and verbal, with the ability to explain technical concepts to non-technical users.
- Ability to manage multiple tasks, prioritize effectively, and work under pressure.
- Strong customer service orientation and detail-oriented with a focus on delivering high-quality results.