Title: A Resume
Date: 2024-02-23 20:00
Modified: 2024-03-13 20:00
Category: Resume
Tags: Cover Letter, Resume
Slug: resume
Authors: Andrew Ridgway
Summary: A Summary of My Work Experience

# OVERVIEW

I am a Senior Data Engineer looking to transition my skills to Data and Solution Architecture as well as project management. I have spent the better part of the last decade refining my ability to take business requirements and turn them into actionable data engineering, analytics, and software projects with trackable metrics. I believe in agnosticism when it comes to coding languages and have experimented with many different languages in my own time. In my career I have used Python, .NET, PowerShell, TSQL, VB and SAS (multiple products) in an enterprise capacity. I also have experience using Google Cloud Platform and AWS tools for ETL and data platform development, as well as git for version control and deployment using various IaC tools. I have also conducted data analysis and modelling on business metrics to find relationships between staff and customer behaviour, and produced actionable recommendations based on the conclusions. In a private context I have also experimented with C, C# and Kotlin. I am looking to further my career by taking my passion for data engineering and analysis, as well as web and software development, and applying it in a strategic context.

# SKILLS & ABILITIES

- Python (scripting, compiling, notebooks – Sagemaker, Jupyter)
- git
- SAS (Base, EG, VA)
- Various Google Cloud tools (Data Fusion, Compute Engine, Cloud Functions)
- Various Amazon tools (EC2, RDS, Kinesis, Glue, Redshift, Lambda, ECS, ECR, EKS)
- Streaming technologies (Kafka, Hive, Spark Streaming)
- Various DB platforms, both on-prem and serverless (MariaDB/MySQL, Postgres/Redshift, SQL Server, RDS/Aurora variants)
- Various Microsoft products (PowerBI, TSQL, Excel, VBA)
- Linux server administration (cron, bash, systemd)
- ETL/ELT development
- Basic data modelling (Kimball, SCD Type 2)
- IaC (CloudFormation, Terraform)
- Datahub deployment
- Dagster orchestration deployments
- DBT modelling and design deployments
- Containerised and cloud-driven data architecture

# EXPERIENCE

## Cloud Data Architect

### _Redeye Apps_

#### _May 2022 - Present_

- Greenfield research, design and deployment of an S3 data lake (Parquet)
- AWS DMS, S3, Athena, Glue
- Research, design and deployment of a data catalog (Datahub)
- Design of a data governance process (Datahub driven)
- Research, design and deployment of orchestration and modelling for transforms (Dagster/DBT into Mesos)
- CI/CD design and deployment of modelling and orchestration using GitLab
- Research, design and deployment of MLOps dev pipelines and deployment strategy
- Design of ETL/pipelines (DBT)
- Design of customer-facing data products and deployment methodologies (fully automated via Kafka/Dagster/DBT)

## Data Engineer

### _TechConnect IT Solutions_

#### _August 2021 – May 2022_

- Design cloud data batch ETL solutions using Python (Glue)
- Design cloud data streaming ETL solutions using Python (Kinesis)
- Solve complex client business problems using software to join and transform data from DBs, web APIs, application APIs and system logs
- Build CI/CD pipelines to ensure smooth deployments (Bitbucket, GitLab)
- Apply prebuilt ML models to software solutions (Sagemaker)
- Assist with the architecting of containerisation solutions (Docker, ECS, ECR)
- API testing and development (gRPC, REST)

## Enterprise Data Warehouse Developer

### _Auto and General Insurance_

#### _August 2019 - August 2021_

- ETL development of CRM, WFP, outbound dialer and inbound switch data in Google Cloud, SAS and TSQL
- Bringing new data to the business to analyse for new insights
- Redeveloped version control and brought git to the data team
- Introduced Python for API enablement in the Enterprise Data Warehouse
- Partnering with the business to focus data projects on actual needs and translating them into technical requirements

## Business Analyst

### _Auto and General Insurance_

#### _January 2018 - August 2019_

- Automate service performance reporting using PowerShell/VBA/SAS
- Learn and leverage SAS EG and VA to streamline Microsoft Excel reporting
- Identify and develop data pipelines to source data from multiple sources and collate it into a single source to identify relationships and trends
- Technologies used include VBA, PowerShell, SQL, web APIs and SAS
- Where SAS is inappropriate, use VBA to automate processes in Microsoft Access and Excel
- Gather requirements to build meaningful reporting solutions
- Provide meaningful analysis of business performance and deliver relevant presentations and reports to senior stakeholders

## Forecasting and Capacity Analyst

### _Auto and General Insurance_

#### _January 2017 – January 2018_

- Develop the outbound forecasting model for the Auto and General sales call centre by analysing the relationship between customer decisions and workload drivers
- Own the complete data pipeline for the model, from identifying and sourcing data to building the reporting and analysing the data and associated drivers
- Forecast inbound workload requirements for the Auto and General sales call centre using time series analysis
- Learn and leverage the Aspect Workforce Management system to ensure efficient forecast generation
- Learn and leverage the capabilities of SAS Enterprise Guide to improve accuracy
- Liaise with people across the business to ensure meaningful, accurate analysis is provided to senior stakeholders
- Analyse monthly, weekly and intraday requirements and ensure the forecast accurately predicts workload for breaks, meetings and leave

## Senior HR Performance Analyst

### _Queensland Department of Justice and Attorney General_

#### _June 2016 - January 2017_

- Harmonise various systems to develop a unified workforce reporting and analysis framework with appropriate metrics
- Use VBA to automate regular reporting in Microsoft Access and Excel
- Participate in government process through the production of briefs including Questions on Notice and Estimates Briefs for departmental executives

## Workforce Business Analyst

### _Queensland Department of Justice and Attorney General_

#### _July 2015 – June 2016_

- Develop and refine current workforce analysis techniques and databases
- Use VBA to automate regular reporting in Microsoft Access and Excel
- Act as liaison between shared service providers and executives and facilitate communication during the implementation of a payroll leave audit
- Gather reporting requirements from various business areas and produce ad-hoc and regular reports as required
- Participate in government process through the production of briefs including Questions on Notice and Estimates Briefs for departmental executives

# EDUCATION

- 2011 Bachelor of Business Management, University of Queensland
- 2008 Bachelor of Arts, University of Queensland

# REFERENCES

- Anthony Stiller, Lead Developer, Data Warehousing, Queensland Health

_0428 038 031_

- Jaime Brian, Head of Cloud Ninjas, TechConnect

_0422 012 17_