Web-Based Platform to Manage and Automate Event Management Services
The UK-based client offers a comprehensive Business Intelligence (BI) solution to its customers, who are mainly event organizers and marketing managers, serving them through different business intelligence techniques. The client helps its customers achieve higher visitor attendance and better conversions from their events by providing insights derived from cutting-edge analytics algorithms. These customers organize both paid and free-to-attend events, for which our client provides a cloud-based platform to manage and automate event management using different types of services.
The client needed to manage audience and exhibitor data, analytics and insights, reporting, charting and other software services. In this process, the client faced the following major challenges:
- Gathering data from different sources
- Providing an automated platform to assist companies in organizing future events
- Providing analytics and insights on the different events held by organizers
In brief, the client needed a system that could provide analytics services and other important insights on different events through a single platform.
Azilen Technologies analyzed the requirements and came up with a solution on the AWS cloud. Azilen preferred AWS because it allows the infrastructure to be managed easily; in addition, there is no need to invest heavily in hardware or worry about keeping that hardware running flawlessly. With these factors in mind, AWS cloud technology was used, which in turn brought a series of technical features to our client.
The platform is developed as a monolith composed of two parts: a web application and backend apps that work simultaneously. The backend apps can later be separated out as individual services based on their context and usage. Some of the major services it offers are the User and Event Management Service, Reporting Service, Analytics Service, Selections Service and Notification Service.
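One way to read the "separable backend services" design is that the web application depends only on small service interfaces, so a service can later move out of the monolith into its own process without changing its callers. The sketch below illustrates this in Python; the class and method names are illustrative assumptions, not the client's actual code.

```python
from abc import ABC, abstractmethod

# The web app depends on this small interface, not on a concrete class,
# so the implementation behind it can later become a remote service.
class NotificationService(ABC):
    @abstractmethod
    def notify(self, user_id: int, message: str) -> bool: ...

class InProcessNotificationService(NotificationService):
    """Runs inside the monolith today; could be swapped for a remote
    client (e.g. over REST or JMS) without touching the callers."""
    def __init__(self):
        self.sent = []

    def notify(self, user_id: int, message: str) -> bool:
        self.sent.append((user_id, message))
        return True

# Caller code is written against the interface only.
svc: NotificationService = InProcessNotificationService()
ok = svc.notify(7, "Event starts at 9am")
```

Because callers never name the concrete class, extracting the service later is a deployment change rather than a code change.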
The Data Integration Service, or ETL (Extract, Transform and Load) service, is used to integrate various heterogeneous data sources into the system. It uses Mule ESB, a lightweight Java-based integration platform that allows quick and easy connection to applications for enabling data exchange. Mule ESB makes it easy to integrate existing systems regardless of the technologies an application uses, such as Web Services, JMS, HTTP and many more.
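The core of such an ETL step is mapping records from differently shaped sources into one common schema before loading. The following is a minimal Python sketch of that idea only, not Mule ESB itself (Mule flows are configured in XML); all field names here are hypothetical.

```python
# Normalize heterogeneous attendee records (e.g. from a CRM export and a
# web form) into one common schema before loading. Field names are
# illustrative assumptions, not the client's real schema.

def transform(record: dict) -> dict:
    """Map a raw record from any source into the common schema."""
    return {
        "email": (record.get("email") or record.get("Email", "")).strip().lower(),
        "name": record.get("name") or record.get("full_name", ""),
        "event_id": str(record.get("event_id") or record.get("eventId", "")),
    }

def load(records, sink):
    """'Load' step: append transformed records to a sink (a list here,
    standing in for a database or queue)."""
    for raw in records:
        sink.append(transform(raw))
    return sink

# Two sources with different field conventions, unified by transform().
crm_rows = [{"Email": " Alice@Example.com ", "full_name": "Alice", "eventId": 42}]
web_rows = [{"email": "bob@example.com", "name": "Bob", "event_id": "42"}]

store = []
load(crm_rows, store)
load(web_rows, store)
```

In a real Mule flow the same mapping would live in a transformer component between the source connector and the target endpoint.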
The monolith service calls AWS S3, which stores data in encrypted form. The data is then passed to AWS Lambda, which triggers the action and validates the data values. Lambda passes valid and invalid data to Mule ESB in separate files.
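The validation step described here amounts to splitting incoming rows into a valid set and an invalid set, which are then written out as separate files. The sketch below shows that splitting logic only, without the S3/Lambda wiring; the validation rules are illustrative assumptions.

```python
# Sketch of the Lambda-style validation step: partition incoming rows
# into valid and invalid sets, which would then be written to two
# separate files for Mule ESB. Rules below are assumptions for
# illustration, not the client's actual checks.

def is_valid(row: dict) -> bool:
    """Assume a row is valid if it has a non-empty email and a numeric
    event_id."""
    return bool(row.get("email")) and str(row.get("event_id", "")).isdigit()

def split_rows(rows):
    valid = [r for r in rows if is_valid(r)]
    invalid = [r for r in rows if not is_valid(r)]
    return valid, invalid

rows = [
    {"email": "alice@example.com", "event_id": "42"},
    {"email": "", "event_id": "42"},                   # missing email
    {"email": "bob@example.com", "event_id": "n/a"},   # non-numeric id
]
valid, invalid = split_rows(rows)
```

In the actual pipeline, an S3 object-created event would invoke the Lambda handler, which would run logic like this and upload the two result files.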
A single RDBMS database stores the entire application data and client-specific configuration, while all client data is stored in a NoSQL database. Based on the nature of the data and the data access patterns, MongoDB, Neo4j, or even both can be chosen. This also supports real-time data analysis within a service.
The user interface is implemented as a web application using jQuery and AngularJS. D3.js and Kendo UI are used along with RESTful web services for dashboard and chart representations. In addition, a JMS-based asynchronous notification queue is used by the notification service provider for events taking place in the system.
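A dashboard like this typically works by having the REST layer serve JSON that the D3.js/Kendo UI charts consume. The sketch below shows one plausible shape for such a payload; the endpoint, field names, and sample figures are hypothetical.

```python
import json

# Hypothetical REST response feeding a D3.js/Kendo UI chart: attendance
# counts per event, serialized as JSON for the dashboard. The payload
# shape and sample numbers are assumptions for illustration.

def attendance_chart_payload(counts: dict) -> str:
    """Build a chart-friendly JSON payload from {event_name: attendees},
    sorted by event name for a stable display order."""
    series = [{"event": name, "attendees": n}
              for name, n in sorted(counts.items())]
    return json.dumps({"series": series})

payload = attendance_chart_payload({"Expo 2023": 1200, "Summit": 350})
```

On the client side, the AngularJS app would fetch this JSON over REST and bind `series` to a D3.js or Kendo UI chart.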
HTTP caching of static data and compression are used to increase UI performance. This is achieved with CSS inlining and JS minification using Grunt tasks.
Marketing managers and third-party API users interact with the system through the web application and REST services, which are managed behind a load balancer. After authentication, users and customers can access the services and apps after passing through local load balancers and an authentication and authorization layer.
Key Design Considerations
- The system easily handles unpredictable load conditions
- Requests are served and accessed from a single availability zone instead of across availability zones
- All applications and databases are deployed into a private subnet
- The system is designed around interfaces, using loose coupling wherever possible
- Open resources are closed promptly to avoid memory leaks
- An executor service abstraction is used to run tasks instead of creating threads directly in the application
- The solution is highly secure and scalable
- A cache is used to store temporary data instead of HttpSession, HttpServletRequest or HttpServletResponse
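The executor-service point in the list above (a Java ExecutorService in this stack) can be sketched with Python's closest stdlib analogue, `concurrent.futures.ThreadPoolExecutor`; the task function here is a placeholder assumption.

```python
from concurrent.futures import ThreadPoolExecutor

# Analogue of the executor-service guideline: submit tasks to a managed
# pool instead of spawning threads directly. The context manager shuts
# the pool down afterwards, which also matches the "close open
# resources" point above.

def send_notification(user_id: int) -> str:
    """Placeholder task standing in for a real notification call."""
    return f"notified:{user_id}"

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() schedules one task per user id and returns results in order.
    results = list(pool.map(send_notification, [1, 2, 3]))
```

Compared with raw threads, the pool bounds concurrency, reuses workers, and gives callers futures/results instead of shared mutable state.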