rahmad07g

About Me

Rahmad Gunawan is an urban planner turned data engineer with over two years of experience in technical writing, spatial analysis, and data visualization. He recently completed the Hacktiv8 Full Time Data Science bootcamp and has transitioned into a Data Engineer role. He has demonstrated his skills through research projects on urban development and planning, applying computer science, modeling, and statistics to extract and analyze data and deliver actionable insights. He currently works as a Data Engineer, designing and executing data solutions for clients.

TECHNICAL SKILLS

Programming Languages: Python, SQL

Databases: PostgreSQL, ClickHouse, MariaDB (exploring)

Libraries: Matplotlib, NumPy, pandas, Plotly, Seaborn, scikit-learn, Streamlit, TensorFlow

Tools: Apache Airflow, DBeaver, Google Colab, GitHub, Grafana, Tableau, Google Data Studio (exploring), Heroku, Docker (exploring), Hadoop, Visual Studio Code, ArcGIS Pro

Languages: Fluent in Bahasa Indonesia, professional working proficiency in English

I love connecting with new people, so feel free to reach out at [email protected] or here on LinkedIn!
My GitHub is https://github.com/rahmad07g, and I have attached my resume for your reference:
https://bit.ly/Ragun_Resume

Education

Bachelor of Urban and Regional Planning, 2022

Sumatera Institute of Technology

I am a graduate in Urban and Regional Planning with a focus on urban area planning and strong spatial analysis skills, and I have worked on several projects producing planning documents. Thesis: "The Principle of Designing Public Open Spaces for Social Activities in Slum Areas: A Case Study of Karang City Village, Bandar Lampung City".

Full Time Data Science (FTDS) Batch 013 2022

Hacktiv8 Bootcamp

Courses:
• Fundamental Python for Data Science, from data types to functions
• Basic SQL for simple data-processing assignments
• Google Cloud Platform (GCP) BigQuery for collecting data to be analyzed
• Math (calculus and linear algebra) for understanding how machine learning and neural networks work
• Statistics (descriptive and inferential statistics, probability) for understanding data distributions
• Exploratory Data Analysis (exploring and visualizing data)
• Feature engineering: handling outliers, missing values, and imbalanced data
• Machine learning and deep learning for advanced data science tasks
• Computer vision, natural language processing, and recommender systems
• Deployment with Streamlit, Flask, and TF-Serving
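The feature-engineering topics above (handling missing values and outliers) can be sketched in pandas roughly as follows; the column name and values are purely illustrative:

```python
import pandas as pd

# Hypothetical readings with one missing value and one extreme outlier
df = pd.DataFrame({"reading": [10.0, 12.0, None, 11.0, 500.0]})

# Handle missing values: fill with the column median
df["reading"] = df["reading"].fillna(df["reading"].median())

# Handle outliers: clip values outside the 1.5 * IQR fences
q1, q3 = df["reading"].quantile([0.25, 0.75])
iqr = q3 - q1
df["reading"] = df["reading"].clip(q1 - 1.5 * iqr, q3 + 1.5 * iqr)
```

Median imputation and IQR clipping are only two of many options covered by such a curriculum; the right choice depends on the data distribution.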

Work & Experience

Specialist IT Supply Chain Automation

Asia Pulp and Paper

03/28/2023

Enhanced Data Accuracy:
- Spearheaded the implementation of data validation and cleansing techniques, raising data accuracy in the FMS system from 30-40% to 60-75%.
- Executed a comprehensive data quality improvement strategy, producing more reliable and precise data for informed decision-making.

Boosted Operational Efficiency:
- Automated monthly and weekly report generation with Python pandas, cutting the time required from 2-3 hours to about 15 minutes.
- Streamlined the reporting process, improving productivity and resource allocation.

Strengthened Fraud Detection and Security Measures:
- Designed and implemented an automated monitoring system on the FMS Coal platform, tracking lead time and detecting vehicles deviating from designated routes.
- Strengthened security protocols, mitigating the risk of fraudulent activity during transportation operations.

Empowered User Proficiency and Data-Driven Decision-Making:
- Provided technical support and conducted tailored training sessions, equipping end users to analyze and visualize data through user-friendly dashboards.
- Enabled data-driven decision-making in supply chain management, helping stakeholders leverage the dashboards for strategic insights.

Skills: Jupyter · SQL Server Management Studio
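The pandas report automation described above might look something like this minimal sketch; the function, column names, and grouping are hypothetical, not the actual FMS schema:

```python
import pandas as pd

def build_weekly_report(trips: pd.DataFrame) -> pd.DataFrame:
    """Aggregate raw trip records into a per-route weekly summary.

    Assumes hypothetical columns: route, date, tonnage.
    """
    trips = trips.copy()
    trips["date"] = pd.to_datetime(trips["date"])
    # ISO week number drives the weekly grouping
    trips["week"] = trips["date"].dt.isocalendar().week
    return (
        trips.groupby(["week", "route"], as_index=False)
        .agg(trips_count=("tonnage", "size"),
             total_tonnage=("tonnage", "sum"))
    )
```

Wrapping the aggregation in a single function like this is what makes a manual multi-hour report reducible to a scheduled run of a few minutes.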

Data Engineer

Lenna.ai

09/14/2022 - 03/27/2023

Data Engineer experienced in managing and transforming diverse data sets of over 1 billion records across multiple tables. Skilled in Python and Apache Airflow for automating data processes, and in PostgreSQL and ClickHouse for efficient storage, retrieval, and optimized data flow. Proficient in data visualization with Grafana, enabling data-driven decision-making.

Projects:
• Telkomsel Duta Area Dashboard: leveraged data on payloads, traffic, revenue, VLR, and POI to provide comprehensive insight into operational performance, supporting informed decision-making.
• CTO/NHI Project: developed national-level dashboards for the Radio Hygiene Index, Transport Hygiene Index, and Service Hygiene Index, enabling proactive monitoring and maintenance of Telkomsel's network infrastructure for optimal performance and customer satisfaction.

Key Achievements:
• Implemented Apache Airflow for data transformation, significantly improving the efficiency and reliability of data workflows and yielding substantial time savings.
• Managed and optimized storage, retrieval, and data flow with PostgreSQL and ClickHouse, ensuring high-quality aggregation and accurate reporting and analysis.
• Delivered actionable insights into crucial operational metrics through the Duta Area Dashboard, contributing to improved decision-making and performance optimization.
• Completed 100% of assigned tasks and improvements, meeting or exceeding deadlines, and advanced the project from 20% to 100% completion within the remaining four months.
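The daily aggregation step such a pipeline schedules can be sketched as follows, using SQLite from the standard library as a stand-in for PostgreSQL/ClickHouse; the table and column names are hypothetical, not Telkomsel's actual schema:

```python
import sqlite3

def aggregate_daily_traffic(conn: sqlite3.Connection) -> None:
    """Roll raw per-site traffic records up into a daily summary table.

    Assumes a hypothetical raw_traffic(ts, site, payload) source table.
    """
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS daily_traffic (
            day TEXT, site TEXT, total_payload REAL,
            PRIMARY KEY (day, site)
        );
        -- Idempotent upsert so the task can be safely re-run by a scheduler
        INSERT OR REPLACE INTO daily_traffic (day, site, total_payload)
        SELECT date(ts) AS day, site, SUM(payload)
        FROM raw_traffic
        GROUP BY date(ts), site;
    """)
    conn.commit()
```

In an Airflow deployment this function would be the callable of a daily task; keeping it idempotent is what lets the scheduler retry or backfill without double-counting.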

Skills

Data Engineering: 80%
Python Programming Language: 80%
SQL: 90%
Data Visualization: 85%
ArcGIS: 85%