Situation

The Client runs a discovery platform that collates unorganized activities and events from different local providers and organizes them into a well-structured format that can be searched through customized recommendations. The problem the Client set out to solve for end users was the time lost to endless research when discovering activities and events.

As a solution, the Client did the research on behalf of end users and provided all the information in one place, aiming to become a one-stop destination for all their discovery needs. To do this, the Client had to deploy manual resources who spent 120 hours a week doing online research and enriching the backend database. Even this was not enough, because providers updated event details on very irregular cycles. As a result, the Client's database always lagged behind the real information in the market, and it carried errors owing to the human element.

Task

How could we minimize the dependency on manual resources for the simple task of data extraction, freeing existing resources for more productive tasks?

How could we design a process that ensures real-time updates of the database, in sync with the market information?

How could we ensure minimal errors in the data, leading to a delightful experience for the Client's end users?

Action

Priorise built a detailed data structure for the existing database from scratch so that it could be optimized further.

Extensive research was done on the information available online to create an information flow along which data extraction took place. This information flow was then optimized for the time taken to extract the information.

The data structure was then improved by reorganizing the data fields as per the information flow.

Data fields were segregated into manual, extracted, and derived fields, since some information could simply be derived from raw data rather than researched and entered manually. A simple example is calculating duration from start time and end time.
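
To illustrate, here is a minimal Python sketch of that field segregation; the Client's real schema and field names are not disclosed in the case study, so everything below is hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical event record illustrating the manual/extracted/derived split.
@dataclass
class EventRecord:
    title: str               # extracted field (scraped from a provider site)
    start_time: datetime     # extracted field
    end_time: datetime       # extracted field
    curator_notes: str = ""  # manual field (still entered by the team)

    @property
    def duration_minutes(self) -> float:
        # Derived field: computed from raw data, never researched or typed in.
        return (self.end_time - self.start_time).total_seconds() / 60

event = EventRecord(
    title="Pottery Workshop",
    start_time=datetime(2024, 5, 6, 10, 0),
    end_time=datetime(2024, 5, 6, 12, 30),
)
print(event.duration_minutes)  # 150.0 -- no manual entry needed
```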

Provider websites were researched to identify those that followed a fixed structure or rarely changed their data structure, prioritizing providers with a significant number of activities on their platform.
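
The case study does not describe how structural stability was assessed; one plausible way to check it programmatically is to fingerprint a page's DOM shape and flag providers whose fingerprint changes between visits. A hedged sketch:

```python
import hashlib
import requests
from bs4 import BeautifulSoup

def structure_fingerprint(url: str) -> str:
    """Hash the page's DOM shape (tags and classes, not text content)."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    shape = [(tag.name, tuple(sorted(tag.get("class", []))))
             for tag in soup.find_all(True)]
    return hashlib.sha256(repr(shape).encode()).hexdigest()

# Comparing today's fingerprint with a stored one flags providers whose
# layout changed, so they get manual review instead of a silent bad scrape.
```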

Priorise then developed an in-house automation solution in the form of a simple-to-implement script. The script extracted the data from the target websites and collated it into the Client's database.
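
The script itself is not published in the case study; the following is a minimal Python sketch of what such an extract-and-collate step could look like, with a hypothetical provider URL, CSS selectors, and SQLite schema standing in for the real ones:

```python
import sqlite3
import requests
from bs4 import BeautifulSoup

PROVIDER_URL = "https://example-provider.com/events"  # hypothetical target

def scrape_events(url: str) -> list[dict]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    events = []
    for card in soup.select("div.event-card"):  # assumed listing markup
        events.append({
            "title": card.select_one("h2.title").get_text(strip=True),
            "venue": card.select_one("span.venue").get_text(strip=True),
            "start": card.select_one("time")["datetime"],
        })
    return events

def upsert(db_path: str, events: list[dict]) -> None:
    # Collate scraped rows into the database, replacing stale duplicates.
    with sqlite3.connect(db_path) as conn:
        conn.execute("""CREATE TABLE IF NOT EXISTS events
                        (title TEXT, venue TEXT, start TEXT,
                         UNIQUE(title, start))""")
        conn.executemany(
            "INSERT OR REPLACE INTO events VALUES (:title, :venue, :start)",
            events)

if __name__ == "__main__":
    upsert("client.db", scrape_events(PROVIDER_URL))
```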

The collected data mentions were then manually analyzed, and the conversations were categorized into tags defined as key themes with sub-themes.
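
A small sketch of how such a theme/sub-theme taxonomy might be represented and applied once the manual analysis has defined the tags; the themes, sub-themes, and keywords below are invented for illustration:

```python
# Hypothetical two-level taxonomy: key themes mapping to sub-themes,
# each sub-theme carrying keyword rules that pre-label mentions for review.
TAXONOMY = {
    "Booking": {
        "Cancellations": ["cancel", "refund"],
        "Availability": ["sold out", "slots", "waitlist"],
    },
    "Content Quality": {
        "Outdated Info": ["wrong time", "old date", "moved venue"],
    },
}

def tag_mention(text: str) -> list[tuple[str, str]]:
    text = text.lower()
    return [(theme, sub)
            for theme, subs in TAXONOMY.items()
            for sub, keywords in subs.items()
            if any(k in text for k in keywords)]

print(tag_mention("Asked for a refund after the class was sold out"))
# [('Booking', 'Cancellations'), ('Booking', 'Availability')]
```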

Sentiment analysis was also performed to understand customer emotions, which helped brands prioritize negative mentions. It also helped them communicate better with customers and develop more relevant messaging.
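
The case study does not name the sentiment tool used; as one plausible stand-in, NLTK's VADER analyzer can score mentions so the most negative surface first:

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon fetch
sia = SentimentIntensityAnalyzer()

mentions = [
    "Loved the pottery workshop, booking was effortless!",
    "Event time on the site was wrong, wasted my whole evening.",
]

# Most negative compound score first, so the team can triage right away.
scored = sorted((sia.polarity_scores(m)["compound"], m) for m in mentions)
for score, text in scored:
    print(f"{score:+.2f}  {text}")
```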

This analysis:

  • Gave brands a 360-degree view of what is most significant to their business
  • Helped brands understand engagement better and resolve customer queries with a shorter turnaround time
  • Surfaced insights such as customers being willing to pay more when they receive a great experience from a brand they follow and trust

Result

  • Average time for data extraction fell from 2 hours to 3 seconds for extracted and derived data fields (which account for 70% of all data fields).
  • Data errors dropped from 8% to less than 2%.
  • Real-time updates of the database became a reality for the Client.
  • Better team productivity: 50% of the team's time could go to new partner acquisition, compared to 10% before automation.