Case Study
How Spearphish.io Built a User-Friendly, Cross-Platform Database Marketing Solution with Scopic and AWS
About Spearphish
Spearphish.io empowers businesses with its data mining toolkit, which includes a data engine and a marketing toolkit. Its data mining and data engine tools enable businesses not only to manage their CRM, website, and lead source data, but also to append and enrich their existing databases. Spearphish.io's marketing toolkit offers solutions such as anonymous web traffic tracking, email blast capabilities, and video emails.
The Challenge
Spearphish.io recognized the complexities that even seasoned marketers face when working with complex data, advanced settings, technologies, and tools. To solve this problem, Spearphish.io envisioned a user-friendly data mining toolkit that streamlines customer data, ad budget management, and campaign optimization. The toolkit would ingest data from different source providers and run campaigns through multiple channels, such as email and social media ads. To realize this vision, Spearphish.io teamed up with Scopic for a seamless implementation of AWS solutions.
To support multitenancy, the project required a centralized application that controls all stores and their configurations from a single, unified platform.
The technical challenges to achieve this were:
- Validating incoming data and structuring it with enhanced mapping logic while avoiding duplicates
- Provisioning isolated database instances for distinct store entities
- Building a single-page, multi-filter, dynamic customer search interface that lets users save custom filters
- Sending a large volume of emails for each campaign
- Extracting detailed campaign reports
The Solution
Scopic selected and implemented the right AWS solutions to support Spearphish.io in building the user-friendly, cross-platform data mining toolkit it envisioned. To achieve this, we built the cloud architecture on several AWS services covering data storage, file management, computing, email delivery, and logging, while ensuring efficiency.
To address the data challenges, we used Amazon RDS to store the real-time data that powers the application's main Query page. Each store was given its own managed relational database, which reduced administrative overhead for tasks such as backups, patch management, and scaling, while ensuring data integrity, security, and scalability. Separate databases per store also keep each tenant's data organized and isolated. In addition, Amazon S3 was used to store files pending processing, archives, exports, and other data, simplifying file storage and retrieval with a cost-effective, scalable solution.
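As an illustration of this per-store isolation, the following is a minimal sketch (Python with boto3) of how a dedicated RDS instance might be provisioned and a raw import file staged in S3 for later processing. The instance class, engine, bucket name, and identifiers are hypothetical placeholders, not details of the Spearphish.io implementation.

```python
# Minimal sketch of per-store data provisioning with boto3.
# All names (store IDs, bucket, instance class, engine) are hypothetical examples.
import boto3

rds = boto3.client("rds", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")

def provision_store_database(store_id: str, master_password: str) -> str:
    """Create an isolated, managed RDS instance for a single store (tenant)."""
    response = rds.create_db_instance(
        DBInstanceIdentifier=f"store-{store_id}",   # one instance per store
        DBInstanceClass="db.t3.medium",
        Engine="mysql",
        AllocatedStorage=50,                        # GiB
        MasterUsername="admin",
        MasterUserPassword=master_password,
        BackupRetentionPeriod=7,                    # managed backups
        MultiAZ=False,
        Tags=[{"Key": "store", "Value": store_id}],
    )
    return response["DBInstance"]["DBInstanceIdentifier"]

def stage_file_for_processing(store_id: str, local_path: str, key: str) -> None:
    """Upload a raw import file to S3 until a background job picks it up."""
    s3.upload_file(local_path, "example-pending-processing-bucket",
                   f"{store_id}/pending/{key}")
```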
To generate charts for reporting and to support the multi-tenant architecture, EC2 instances were used to run Docker for multiple applications and independent servers such as Puppeteer. And because many heavy jobs are scheduled and executed periodically in this application, we used ECS with Fargate to run background jobs as separate tasks, which proved an optimal solution for containerized workloads.
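The sketch below shows how such a background job could be launched as a one-off Fargate task with boto3. The cluster, task definition, container name, and network identifiers are placeholders, not the project's actual configuration.

```python
# Minimal sketch: running a scheduled background job as a one-off ECS task
# on Fargate. All resource names and network IDs are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

def run_background_job(job_name: str, store_id: str) -> str:
    """Launch a containerized job (e.g. a heavy export) as a Fargate task."""
    response = ecs.run_task(
        cluster="example-background-jobs",
        launchType="FARGATE",
        taskDefinition="example-worker:1",
        networkConfiguration={
            "awsvpcConfiguration": {
                "subnets": ["subnet-0123456789abcdef0"],
                "securityGroups": ["sg-0123456789abcdef0"],
                "assignPublicIp": "DISABLED",
            }
        },
        overrides={
            "containerOverrides": [{
                "name": "worker",
                "command": ["python", "run_job.py", job_name, "--store", store_id],
            }]
        },
    )
    return response["tasks"][0]["taskArn"]
```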
Another essential service in this application was Amazon Route 53 for DNS management: each store in the platform required a dedicated domain, and each store needed to be able to send its own emails.
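To illustrate, a new store's domain could be wired up in Route 53 roughly as follows. The subdomain, target DNS name, and record choices are assumptions made for the example, not the platform's real records.

```python
# Minimal sketch: creating a hosted zone for a store's dedicated domain and
# pointing an app subdomain at the platform. Domain and target are examples.
import uuid
import boto3

route53 = boto3.client("route53")

def create_store_domain(domain: str, load_balancer_dns: str) -> str:
    """Create a hosted zone for a store and point app.<domain> at the platform."""
    zone = route53.create_hosted_zone(
        Name=domain,
        CallerReference=str(uuid.uuid4()),  # must be unique per request
    )
    zone_id = zone["HostedZone"]["Id"]
    route53.change_resource_record_sets(
        HostedZoneId=zone_id,
        ChangeBatch={
            "Changes": [{
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": f"app.{domain}",
                    "Type": "CNAME",
                    "TTL": 300,
                    "ResourceRecords": [{"Value": load_balancer_dns}],
                },
            }]
        },
    )
    # Mail-related records (e.g. SPF, DKIM) for the store's outbound email
    # would be added to the same hosted zone in the same way.
    return zone_id
```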
We used Amazon CloudWatch for monitoring and logging. It also helped us debug both the main application and the background jobs run as ECS tasks.
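As a simple example of how such logs can be inspected, the sketch below pulls recent error lines from a CloudWatch Logs group. The log group name and filter pattern are hypothetical.

```python
# Minimal sketch: fetching recent error lines from a CloudWatch Logs group
# used by the ECS background jobs. The log group name is a placeholder.
import time
import boto3

logs = boto3.client("logs", region_name="us-east-1")

def recent_job_errors(minutes: int = 60) -> list[str]:
    """Return error messages logged by background jobs in the last hour."""
    start = int((time.time() - minutes * 60) * 1000)  # epoch milliseconds
    response = logs.filter_log_events(
        logGroupName="/ecs/example-background-jobs",
        filterPattern="ERROR",
        startTime=start,
    )
    return [event["message"] for event in response["events"]]
```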
AWS CloudFormation was employed to define and provision the AWS infrastructure as code. It allowed us to create, update, and delete resources in a controlled and automated way, improving infrastructure lifecycle management. To further streamline development and deployment, we used AWS CodeBuild, CodeDeploy, and CodePipeline, which made the team more productive while ensuring blue/green, zero-downtime deployments.
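A stack for a single store could be provisioned from such a template roughly as follows. The stack naming, parameters, and capabilities shown here are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch: provisioning a store's infrastructure from a CloudFormation
# template. The stack name, parameter, and tag conventions are illustrative.
import boto3

cloudformation = boto3.client("cloudformation", region_name="us-east-1")

def deploy_store_stack(store_id: str, template_body: str) -> str:
    """Create the CloudFormation stack that backs one store."""
    stack_name = f"store-{store_id}"
    response = cloudformation.create_stack(
        StackName=stack_name,
        TemplateBody=template_body,
        Parameters=[{"ParameterKey": "StoreId", "ParameterValue": store_id}],
        Capabilities=["CAPABILITY_NAMED_IAM"],
        Tags=[{"Key": "store", "Value": store_id}],
    )
    # Block until all resources are fully provisioned before continuing.
    waiter = cloudformation.get_waiter("stack_create_complete")
    waiter.wait(StackName=stack_name)
    return response["StackId"]
```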
Other AWS services were also used for specific use cases in this application, including Lambda, API Gateway, Systems Manager parameters, and firewall services.
List of AWS Services Used:
- Amazon RDS (Relational Database Service)
- Amazon S3 (Simple Storage Service)
- Amazon EC2 (Elastic Compute Cloud)
- Amazon ECS (Elastic Container Service)
- AWS Lambda & Amazon API Gateway
- Amazon Route 53
- Amazon CloudWatch
- AWS CloudFormation
- AWS CodeBuild, CodeDeploy, and CodePipeline
- Other AWS services, such as Systems Manager parameters and firewall services, used for specific use cases
The Result
Thanks to the efficient use of AWS services, all the requirements necessary to build the data mining platform, such as data storage, file management, computing, email delivery, and logging, were addressed.
Spearphish.io now boasts a secure, scalable, and well-organized database that no longer requires manual management, along with a simplified, cost-effective, and scalable file storage and retrieval solution.
Have more questions?
Talk to us about what you’re looking for. We’ll share our knowledge and guide you on your journey.