Analytics
Surveys & Discovering the Broken RAQ Button→
At a glance:
Managed survey results and discovered that the ‘Request a Quote’ (RAQ) submission button was malfunctioning. Reported the issue, quantified its impact, and collaborated with developers on a fix, recovering lost leads valued at $25.2MM.
Project Overview
Distributed surveys to users, targeted by time on site and pages visited, to gather feedback for improving the user experience.
Flagged and monitored comments from users who filled out the ‘Request a Quote’ form but were never followed up with by the company.
Investigated by cross-referencing each survey’s submission time against session recordings in ‘Quantum’ (a data management system) to watch the user’s experience, tag similar errors, and quantify how often the failure occurred; a rough sketch of this matching step follows the list.
Synthesized large volumes of data from Google Analytics and Quantum, applying quantitative modeling to surface actionable insights for distinct audience segments across website performance, campaigns, usability/accessibility, and user behaviors and preferences.
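As an illustration of the matching step above, here is a minimal Python sketch that joins survey responses to session records by timestamp and counts tagged failures. The file names, column names, and the "raq_submit_fail" event tag are all assumptions for illustration; Quantum’s real export format differed.

```python
# Hypothetical sketch: match survey complaints to the nearest preceding session.
import pandas as pd

# Survey responses with submission timestamps and free-text comments (assumed schema).
surveys = pd.read_csv("survey_responses.csv", parse_dates=["submitted_at"])
# Session records exported from the session-replay tool (assumed schema).
sessions = pd.read_csv("quantum_sessions.csv", parse_dates=["session_start"])

# Keep only complaints mentioning the Request a Quote form.
raq = surveys[surveys["comment"].str.contains("quote", case=False, na=False)]

# Match each complaint to the most recent session that started before it.
matched = pd.merge_asof(
    raq.sort_values("submitted_at"),
    sessions.sort_values("session_start"),
    left_on="submitted_at",
    right_on="session_start",
    direction="backward",
)

# Tag sessions showing the failed RAQ submission and quantify the occurrence.
matched["raq_error"] = matched["events"].str.contains("raq_submit_fail", na=False)
print(f"Tagged occurrences: {matched['raq_error'].sum()}")
```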
Outcome and Results
The submission button was not working properly. I reported it to my manager and connected with developers to create a fix. The Quantum team quantified the loss: we recovered lost leads valued at $25.2MM.
Optimization - Web Scraper Integration→
At a glance:
Overhauled a business process by designing and implementing a web scraper to collect MNopedia article metrics and organize the data in an Excel sheet, cutting collection time from weeks of manual work to under five minutes.
Project Overview
I was asked to go through the articles written in the past four years and determine each one’s topic, quality of interactions, and view count to gain insight into performance and popular subjects.
This was traditionally done manually every three to four years, and the sheer volume of articles made the process slow. I asked for a week to experiment with automating it and set out to build a web scraper; a sketch of the approach appears below.
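A minimal sketch of what such a scraper could look like, assuming Python with requests, BeautifulSoup, and pandas. The URL list, the CSS selectors ("h1", ".view-count"), and the output columns are placeholders for illustration, not MNopedia’s actual markup or the code I shipped.

```python
# Hypothetical sketch: pull each article's title and view count into an Excel sheet.
import requests
from bs4 import BeautifulSoup
import pandas as pd

article_urls = [
    "https://www.mnopedia.org/example-article",  # placeholder URL
]

rows = []
for url in article_urls:
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    rows.append({
        "url": url,
        "title": soup.select_one("h1").get_text(strip=True),
        # Assumed selector for a view counter rendered on the page.
        "views": soup.select_one(".view-count").get_text(strip=True),
    })

# Organize everything into one Excel sheet for review.
pd.DataFrame(rows).to_excel("mnopedia_metrics.xlsx", index=False)
```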
Assets
I had learned about web scrapers previously while doing market research. Most sites block you if they detect software gathering intel without paying, but working from the backend gave me richer page details and the freedom to run the code without restriction.
Outcome and Results
It worked extremely well. I did additional research on recreating it with browser extensions so the work could be done by anyone, without prior knowledge of the code or of programming in general. I included a visual guide with the instructions on their drive, along with notes on how the scraper could be used to monitor which pages could use an SEO boost, how to make those edits on the backend, and how to build a word cloud from the results for additional data analysis; a sketch of that last step follows.
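As one example of the word-cloud step, here is a minimal sketch assuming the wordcloud Python package and the Excel output from the scraper sketch above; the file names are placeholders.

```python
# Hypothetical sketch: build a word cloud from scraped article titles
# to surface popular subjects at a glance.
import pandas as pd
from wordcloud import WordCloud

# Assumes the Excel sheet produced by the scraper sketch.
titles = pd.read_excel("mnopedia_metrics.xlsx")["title"]

cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate(" ".join(titles.astype(str)))
cloud.to_file("mnopedia_wordcloud.png")
```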