AVP - Data Warehouse & Business Intelligence Engineering
Location: ID
Level: Managerial
Employment Status: Permanent
Department: Group Digital Engineering and Transformation
Role Purpose
To lead, oversee, and guide Data Warehouse, Big Data, and Business Intelligence (BI) Engineering activities for end-to-end business solutions, enterprise-scale analytics, and modern data architectures. This role ensures high-performance ETL, reporting, dashboarding, and data visualization while driving automation, cloud transformation (GCP), and Hadoop ecosystem optimization.
Role & Responsibilities: Data Warehouse and Business Intelligence Engineering
Strategic Leadership & Governance
- End-to-End Data & Analytics Strategy: Lead data-driven decision-making by aligning Big Data, BI, and Cloud initiatives with business objectives.
- Agile & Waterfall Development: Manage bimodal IT development models to balance rapid innovation with stability and compliance in data projects.
- Vendor & Stakeholder Management: Lead cross-functional teams, including BI Developers, Cloud Data Engineers, Big Data Engineers, and Vendor Partners, ensuring SLA compliance and high-quality data services.
Big Data, Hadoop, and Data Engineering
- Hadoop Ecosystem Leadership: Apply strong expertise in the Hadoop ecosystem (HDFS, Hive, Spark, Impala, HBase, Kafka, Oozie, Sqoop) to ensure efficient, scalable, and fault-tolerant big data solutions.
- Optimized Data Processing: Implement high-throughput data pipelines for batch & real-time analytics using Spark, Flink, and Kafka Streams.
- Data Management: Optimize data architectures using Hadoop, GCS, Iceberg, or BigQuery.
Business Intelligence, Reporting & Data Visualization
- Enterprise-Grade Reporting & Dashboarding: Design and deliver scalable, real-time BI dashboards using Tableau, Power BI, Looker, Data Studio, or other tools.
- Self-Service BI Enablement: Develop ad-hoc reporting capabilities to empower business users and reduce dependency on IT.
- Data Modeling for BI: Design and maintain data marts to enhance BI performance and analytical insights.
- Data Governance & Quality Assurance: Enforce data integrity, lineage tracking, and metadata management for consistent and trusted reporting.
Cloud Data Engineering & GCP Capabilities
- BigQuery Optimization: Leverage partitioning, clustering, and materialized views for cost-effective and high-speed queries.
- ETL & Orchestration: Develop robust ETL/ELT pipelines using Cloud Data Fusion, Apache Beam, Dataflow, and Airflow.
- Hybrid Cloud & On-Prem Integration: Seamlessly integrate Hadoop-based Big Data systems with GCP, on-premises databases, and legacy BI tools.
BI DevOps, Automation & Innovation
- BI DevOps & Continuous Delivery: Implement CI/CD pipelines to accelerate BI feature releases, ETL deployments, and dashboard updates.
- Data Observability & Quality Monitoring: Ensure end-to-end monitoring of data pipelines, anomaly detection, and real-time alerting.
- AI/ML Integration for BI: Apply predictive analytics and AI-driven insights to enhance business intelligence and reporting.
- Bottleneck Identification & Resolution: Proactively identify and eliminate performance issues in Hadoop clusters, ETL pipelines, and BI reporting layers.
Minimum Requirements
Qualification: Minimum University Degree (S1), preferably in Information Technology, Computer Science, Electrical Engineering, Telecommunications, or Mathematics/Statistics.
Experience: At least 5 years of experience across the full lifecycle of Data Warehouse, Business Intelligence, and Analytics & Reporting. Experience in the telecommunications industry is preferred. Experience managing a team is an advantage.
Skills:
- Very good analytical thinking and problem-solving skills for identifying business problems, understanding stakeholder needs, and assessing and formulating solutions.
- Very good communication skills in Indonesian and English.
- Very good skills in technical writing and reporting.
- Very good presentation and persuasion skills.
- Very good collaboration skills across a wide range of stakeholders.
- Very good knowledge in Data Warehousing, Big Data, and BI architecture, technology, design, development, and operation.
- Good knowledge of the telecommunication business in general.
- Experience and knowledge in processing CDRs from telco systems, e.g., Charging and Billing, GGSN, MSC, VLR, SMSC, etc.
- Experience managing DWH & BI project teams and operations.
- Experience working with near real-time data, huge data volumes, and unstructured data processing.
- Familiar and hands-on with the following technology stack:
  - Big Data & Hadoop: HDFS, Hive, Spark, Impala, HBase, Kafka, Oozie, Sqoop, Presto, Kudu, Cloudera, Hortonworks, EMR.
  - BI & Reporting: Tableau, Power BI, Looker, etc.
  - Cloud & Data Warehousing: GCP (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage).
  - ETL & Orchestration: Apache Beam, Dataflow, Cloud Data Fusion, Airflow, Informatica.
  - DevOps & Automation: Terraform, GitOps, Kubernetes, CI/CD for DataOps.
- Good knowledge of IT infrastructure in the areas of Server, Storage, Database, Backup Systems, Desktop, Local/Wide Area Networks, Data Center, and Disaster Recovery.
- Good knowledge in Agile Development (Scrum), Business Process Framework (eTOM), Application Framework (TAM), and Information Framework (SID).
Scope
Area of Responsibilities
- Strategic Planning: Translate the Company and Group Vision, Mission, and Strategy into a divisional work plan, especially in the DWH & BI area. Provide and clarify divisional objectives in the DWH & BI area for the BI Developer team.
- Applications & Services Development: Assess, define, and provide solutions for system and service development related to Big Data, Business Intelligence, Data Warehouse (DWH), Hadoop, GCP, and enterprise systems. Maintain the solution as a reference for development and configuration activities. Manage Application Developers and Vendor Partners under Agile and Waterfall (two-speed) operational approaches for the solution design and development of the required applications and services.
- Quality Assurance & Release Management for Agile Delivery: Ensure the quality of BI delivery by testing and monitoring the integrity and completeness of data warehouse data for analytics and reporting. Manage releases for Agile development delivery.
- Operational: Support the Hadoop & DWH Managed Service vendor in delivering daily reporting that is on time, correct, and complete, and that achieves the SLA defined in the SOW of the Hadoop Managed Service contract.
- DevOps Automation: Promote and implement best-practice DevOps automation tools that enable the DevOps team to develop and deploy high-quality applications and services through an automated CI/CD delivery process, reducing bureaucracy in the approval process and accelerating development and delivery quality. The automation process covers, among others, automated demand management, version control, automated build, automated test, release management, and automated deployment.
- Subject Matter Expert (SME): Serve as Subject Matter Expert in the BI area for technical business solution and delivery support, ensuring that project deliverables are in accordance with the defined solution and time plan. Monitor data quality for existing processes/systems and after the activation or launch of new products/systems to measure business, revenue, and customer experience impact. Ensure the quality of data content in Hadoop, GCP, and BI reports.
- Team Management: Lead, oversee, and guide all BI Developer activities. Develop and mentor BI Developer team members to achieve divisional objectives.