I'm Rodolfo Mendivil

About Me

Who Am I?

My name is Rodolfo Mendivil, and I bring over 8 years of experience in Data Analytics, Business Intelligence, Data Engineering, and Data Science. Throughout my career, I have specialized in building and optimizing data pipelines and architectures, and in designing and implementing scalable solutions for data storage, integration, and management. My commitment to driving data-first initiatives has consistently helped me optimize data delivery for cross-functional teams.

My expertise lies in leveraging advanced technologies and best practices to keep data operations efficient, reliable, and secure. I have a proven track record of improving data accessibility and quality, enabling more informed decision-making across organizations. I am passionate about projects that transform businesses into data-driven organizations, where I can innovate on data strategy, streamline workflows, and support business goals through data-driven insights.

I am eager to explore new opportunities to contribute my skills and experience to projects that prioritize data-driven transformations.

Technologies

Certifications

My Services

Data Pipeline Development

ETL/ELT Pipeline Design and Implementation

Automated Data Ingestion from Various Sources

Real-Time Data Processing Automation

Automated Data Cleaning and Transformation

Workflow Automation and Scheduling

Data Warehousing Solutions

Automated Data Warehouse Architecture Design

Automated Data Lake Implementation

Database Schema Design and Optimization

Automated Cloud Data Warehousing (e.g., AWS Redshift, Google BigQuery, Azure Synapse)

Automated Data Integration from Multiple Sources and Scheduling

Big Data Technologies

Automated Hadoop Ecosystem Management

Automated Spark and Distributed Computing

NoSQL Databases (e.g., Cassandra, MongoDB) with Automation

Automated Data Stream Processing (e.g., Kafka, Flink)

Scalable Storage Solutions with Automation

Data Quality and Governance

Automated Data Quality Assessment and Improvement

Automated Data Lineage and Metadata Management

Data Governance Strategy with Automation

Master Data Management (MDM) Automation

Automated Compliance and Data Security

Data Analytics and BI Solutions

Business Intelligence (BI) Tools Integration with Automation (e.g., Tableau, Power BI)

Automated Data Visualization and Reporting

Advanced Analytics and Predictive Modeling Automation

Self-Service Analytics Solutions with Automation

Automated Dashboard Development

Cloud Data Engineering

Automated Cloud Platform Setup and Management (AWS, GCP, Azure)

Automated Cloud Migration Services

Serverless Data Processing Automation (e.g., AWS Lambda, Google Cloud Functions)

Containerization and Orchestration Automation (e.g., Docker, Kubernetes)

Cost Optimization and Performance Tuning with Automation

PROJECTS

Aeromexico BI Groups and Connections Revenue Analysis

name:'Aeromexico BI Groups and Connections Revenue Analysis',
tools: ['Amazon S3', 'Databricks (with Apache Spark, Delta Lake)', 'AWS Glue', 'AWS Step Functions', 'Amazon Redshift', 'Power BI'],
myRole:BI Data Engineer,
Description: In this Aeroméxico BI project, data from various sources, including flight operations and ticket sales, is ingested into Amazon S3. Databricks, powered by Apache Spark and Delta Lake, processes the data in batch jobs to transform and clean it for further analysis. AWS Glue and Step Functions orchestrate the ETL pipeline, ensuring smooth data flows. The processed data is stored in Amazon Redshift, where it is analyzed for revenue insights, particularly connection revenue. Power BI is used to visualize these insights, helping the BI team track key performance metrics related to connections and revenue optimization. A highlight of the solution is the logic that calculates arrival dates and times for flights worldwide.,
};
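
As a rough illustration of the batch transform stage described above, here is a minimal PySpark sketch in the spirit of the Databricks jobs: it reads raw ticket-sales records from S3, cleans them, and aggregates connection revenue before the curated table is copied into Redshift. The bucket paths and column names (fare_amount, is_connection, and so on) are hypothetical placeholders, not the project's actual schema.

# Hypothetical batch transform; assumes a Databricks/Spark runtime with Delta Lake enabled.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("connection-revenue-batch").getOrCreate()

# Raw ticket sales landed in S3 by the upstream ingestion jobs (placeholder path).
tickets = spark.read.json("s3://aeromexico-raw/ticket-sales/")

# Basic cleaning: drop records missing fares or timestamps, then type the timestamps.
clean = (
    tickets
    .dropna(subset=["fare_amount", "departure_ts", "arrival_ts"])
    .withColumn("departure_ts", F.to_timestamp("departure_ts"))
    .withColumn("arrival_ts", F.to_timestamp("arrival_ts"))
)

# Aggregate revenue attributable to connecting itineraries per route and day.
connection_revenue = (
    clean
    .filter(F.col("is_connection"))
    .groupBy(F.to_date("departure_ts").alias("flight_date"), "origin", "destination")
    .agg(F.sum("fare_amount").alias("connection_revenue"))
)

# Persist the curated Delta table; a downstream step loads it into Amazon Redshift.
connection_revenue.write.format("delta").mode("overwrite").save("s3://aeromexico-curated/connection_revenue/")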

Coca Cola Latam DI BI Dashboards and Reports

name:'Coca Cola Latam DI BI Dashboards and Reports',
tools: ['Azure Data Factory (ADF)', 'Azure Blob Storage', 'Azure SQL Database', 'Synapse Analytics', 'Azure Data Lake', 'Power BI', 'Azure Analysis Services', 'Azure Monitor / Log Analytics', 'Azure DevOps', 'Azure Active Directory (AAD)', 'Power Query', 'DAX (Data Analysis Expressions)'],
myRole:BI Data Engineer,
Description: The project aims to analyze Coca-Cola’s D.A. using Power BI for visualization, supported by an Azure-based data pipeline. Azure Data Factory orchestrates data ingestion from various sources into Azure Blob Storage for raw data storage. Azure SQL Database or Synapse Analytics provides structured data processing, while Azure Data Lake stores large datasets. Power BI integrates data from these sources, using Power Query for data transformation and DAX for calculations. Azure Analysis Services supports advanced data modeling. Azure Monitor/Log Analytics ensures performance tracking, with Azure DevOps handling CI/CD and Azure Active Directory securing access. Each tool contributes scalability, security, and efficiency, and the project delivered results beyond its original scope.,
};
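
To give a hedged idea of one transform-and-load hop that Azure Data Factory orchestrates in this pipeline, the following Python sketch pulls a raw extract from Blob Storage, tidies it with pandas, and appends it to an Azure SQL staging table that Power BI can read. The connection strings, container, blob, and table names are illustrative assumptions, not the project's real configuration.

# Hypothetical transform-and-load step between Blob Storage and Azure SQL.
import io
import pandas as pd
from azure.storage.blob import BlobServiceClient
from sqlalchemy import create_engine

# Placeholder storage account connection string and raw file location.
blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
raw_blob = blob_service.get_blob_client(container="raw", blob="sales/latam_daily.csv")

# Download the raw CSV landed by the ingestion pipeline.
sales = pd.read_csv(io.BytesIO(raw_blob.download_blob().readall()))

# Light cleanup before the curated layer: typed dates, no null revenue rows.
sales["sale_date"] = pd.to_datetime(sales["sale_date"])
sales = sales.dropna(subset=["net_revenue"])

# Append into an Azure SQL staging table, where Power BI and Azure Analysis Services pick it up.
engine = create_engine(
    "mssql+pyodbc://bi_loader:***@cocacola-latam-sql.database.windows.net/latam_dw"
    "?driver=ODBC+Driver+18+for+SQL+Server"
)
sales.to_sql("stg_latam_daily_sales", engine, if_exists="append", index=False)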

Tupperware Order Fulfillment ML Project

name:'Tupperware Order Fulfillment ML Project',
tools: ['Azure Data Factory (ADF)', 'Azure Blob Storage', 'Azure SQL Database / Synapse Analytics', 'Azure Data Lake', 'Azure Databricks', 'Azure Machine Learning', 'Power BI', 'Azure Analysis Services', 'Azure DevOps', 'Azure Monitor / Log Analytics', 'Azure Active Directory (AAD)'],
myRole:BI Data Engineer,
Description: The Tupperware project involves processing files from multiple sources in a batch pipeline using Azure. Azure Data Factory orchestrates data ingestion, with Azure Blob Storage storing raw files and Azure Data Lake handling large datasets. Databricks is used for data transformation and machine learning model development to enhance order fulfillment. Azure SQL Database or Synapse Analytics supports structured data storage for reporting. Power BI delivers BI analysis with advanced data modeling from Azure Analysis Services. Azure DevOps manages CI/CD, and Azure Monitor/Log Analytics ensures pipeline performance. Azure Active Directory secures access, ensuring efficient, scalable, and secure operations.,
};
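
The sketch below gives a hedged, simplified picture of the kind of fulfillment model such a Databricks workflow might train; it uses scikit-learn locally rather than the project's Azure Machine Learning setup, and the dataset path, feature columns, and target are hypothetical stand-ins for the curated order tables.

# Hypothetical order-fulfillment classifier; schema and paths are placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Placeholder for the curated orders table produced by the Databricks transforms.
orders = pd.read_parquet("/dbfs/curated/orders_features.parquet")

features = ["order_qty", "warehouse_stock", "carrier_lead_days", "priority_flag"]
target = "shipped_on_time"

X_train, X_test, y_train, y_test = train_test_split(
    orders[features], orders[target], test_size=0.2, random_state=42
)

# Train a simple gradient-boosted model to predict on-time shipment.
model = GradientBoostingClassifier()
model.fit(X_train, y_train)

# Score held-out orders; in the project, predictions feed the Power BI fulfillment dashboards.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")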

INTER Data Migration

name:'INTER Data Migration',
tools: ['Azure Database Migration Service (DMS)', 'Azure Data Factory (ADF)', 'Oracle Data Pump', 'Azure Blob Storage', 'Azure SQL Database / Synapse Analytics', 'Azure Data Lake', 'Azure Monitor / Log Analytics', 'PowerShell / Oracle SQL*Plus'],
myRole:BI Data Engineer,
Description: The project focuses on migrating data from an Oracle database to Azure for INTER Insurance Company. Azure Database Migration Service (DMS) was chosen to automate and streamline the migration, minimizing downtime. Azure Data Factory (ADF) manages the data pipeline, extracting Oracle data and loading it into Azure. Oracle Data Pump exports data in a portable format, temporarily stored in Azure Blob Storage. Azure SQL Database or Synapse Analytics stores the migrated data, while Azure Data Lake manages large datasets. PowerShell and Oracle SQL*Plus automate tasks, and Azure Monitor/Log Analytics ensures the migration runs smoothly. The selected tools offer scalability, reliability, and minimal business impact, making them ideal for handling the sensitive and high-volume data of an insurance company. The project achieved a 98% reduction in downtime, enabling INTER Insurance to maintain business continuity and improving data accessibility for analytics and reporting.,
};
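
As a hedged sketch of the kind of validation script that accompanies a migration like this, the snippet below compares row counts for a few tables between the Oracle source and the Azure SQL target. Connection details and table names are placeholders, not INTER's actual environment.

# Hypothetical post-migration reconciliation; credentials, DSN, and tables are placeholders.
import oracledb
import pyodbc

TABLES = ["POLICIES", "CLAIMS", "CUSTOMERS"]  # illustrative table list

oracle_conn = oracledb.connect(user="migration_ro", password="***", dsn="inter-oracle-host/ORCLPDB1")
azure_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=inter-sql.database.windows.net;DATABASE=inter_dw;UID=migration_ro;PWD=***"
)

def count_rows(cursor, table):
    # A plain COUNT(*) works on both Oracle and SQL Server for a coarse reconciliation.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

src, dst = oracle_conn.cursor(), azure_conn.cursor()
for table in TABLES:
    source_count = count_rows(src, table)
    target_count = count_rows(dst, table)
    status = "OK" if source_count == target_count else "MISMATCH"
    print(f"{table}: source={source_count} target={target_count} {status}")

src.close()
dst.close()
oracle_conn.close()
azure_conn.close()
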
Testimonials

José Alejandro Gutiérrez

Thomson Reuters

Rodolfo Mendívil is an outstanding Data Engineer who played a pivotal role in our data transformation projects at Thomson Reuters. Rodolfo's attention to detail, problem-solving skills, and ability to handle complex datasets were truly impressive. He consistently delivered high-quality work under tight deadlines and was a pleasure to collaborate with. I highly recommend Rodolfo for any data engineering role; his expertise and professionalism are exceptional.
Blogs

2 months ago

“Unlocking Cost-Efficiency and Scalability: How Snowflake is Revolutionizing Data Management for Modern Businesses”

5 Min Read

As a business leader, you’re always looking for ways to optimize your data infrastructure while...

2 months ago

“Mastering Informatica Intelligent Cloud Services (IICS) for Cloud Data Integration”

3 Min Read

As organizations increasingly move their data and applications to the cloud, mastering cloud-native...

CONTACT ME

If you have any questions or concerns, please don't hesitate to contact me. I am open to any work opportunities that align with my skills and interests.

rodolfo@mendivil-data.pro

Mexico cell: +526861197876

US cell: +16194003376

421 Broadway, Suite 5015, San Diego, California 92101

© All Rights Reserved. Rodolfo Mendivil.