Tag: analytics

Optimizing Your Customer Experience Management

Posted on August 15, 2017


A customer's experience with your organization may, in fact, be more important than the quality of your products or services. Customers today want to feel valued: they want their needs both anticipated and fulfilled. The practice of improving and optimizing those experiences is called customer experience management, and new technologies offer many ways to improve it and, in turn, your ROI.

Integrate Your CRM, Marketing Automation, and Media Solutions Into a Single Infrastructure

Optimizing customer experience begins with consolidating and analyzing your data. To that end, integrating your CRM and marketing solutions can be an incredibly effective first step. Comprehensive CRM and marketing automation platforms such as Salesforce, Marketo, and HubSpot almost universally ship with third-party integrations out of the box. For more bespoke infrastructures, APIs, import/export routines, or custom programming may be required. Either way, the goal is a single infrastructure that contains all of your customer information.
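As a rough illustration of what that integration glue can look like, here is a minimal sketch of syncing contacts from a marketing-automation system into a CRM over REST. The endpoints, field names, and auth header are hypothetical; real platforms have their own APIs and authentication schemes.

```python
import requests

# Hypothetical endpoints and field names for illustration only; real CRM and
# marketing-automation APIs (and their auth schemes) will differ.
MARKETING_API = "https://marketing.example.com/api/contacts"
CRM_API = "https://crm.example.com/api/customers"
HEADERS = {"Authorization": "Bearer <token>"}

def sync_contacts_to_crm():
    """Pull recently updated contacts from marketing and upsert them into the CRM."""
    contacts = requests.get(MARKETING_API, headers=HEADERS, timeout=30).json()
    for contact in contacts:
        record = {
            "email": contact["email"],
            "name": contact.get("name"),
            "last_campaign": contact.get("last_campaign"),
            "lead_score": contact.get("score", 0),
        }
        # Upsert keyed on email so both systems converge on one customer record.
        requests.post(CRM_API, json=record, headers=HEADERS, timeout=30)

if __name__ == "__main__":
    sync_contacts_to_crm()
```

Run on a schedule or triggered by webhooks, a job like this keeps both systems pointed at the same view of each customer.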

Not only does this improve analytics, but it also improves customer care overall. Customer service representatives and sales personnel alike will have all of the information they need to service customers quickly. Marketing campaigns can target customers based on their prior behaviors and prompt them toward purchasing more effectively.

Develop an Omni-Channel Approach through Content Management Systems

Content Management Systems (CMS) make it easier to push content directly to a multitude of different channels. Social media, email marketing, and websites can all be consolidated under a single content system — so that a single push of the button can update customers on a variety of platforms. Omni-channel approaches make it easier to scale your organization upwards and to reach out to individuals across multiple demographics and interests. Through regular content distribution, companies can achieve better organic growth and improve upon their inbound marketing.
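To make the "single push of the button" idea concrete, here is a minimal sketch of an omni-channel publish step. The channel adapters are placeholders; a real CMS would call each platform's own publishing API.

```python
# A minimal sketch of "publish once, push everywhere." The channel adapters are
# placeholders standing in for real platform APIs.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Content:
    title: str
    body: str
    tags: List[str]

def publish_to_website(content: Content) -> None:
    print(f"[website] published: {content.title}")

def publish_to_email(content: Content) -> None:
    print(f"[email] queued newsletter: {content.title}")

def publish_to_social(content: Content) -> None:
    print(f"[social] posted teaser: {content.title} #{' #'.join(content.tags)}")

CHANNELS: List[Callable[[Content], None]] = [
    publish_to_website,
    publish_to_email,
    publish_to_social,
]

def publish_everywhere(content: Content) -> None:
    """One call fans the same piece of content out to every registered channel."""
    for channel in CHANNELS:
        channel(content)

publish_everywhere(Content("Spring Sale", "Everything 20% off this week.", ["sale"]))
```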

A CMS is particularly useful for lead procurement and demand generation. With the use of a CMS, a strong and strategic digital marketing campaign can ensure that leads come to the business rather than the business having to procure leads. Organizations are thus able to improve upon their ROI, extend their marketing reach, and refocus their budget to additional areas of advertising and support.

Explore Big Data, Such as Emotional Analytics and Predictive Intelligence

Emotional analytics and big data can work together to develop new strategies for customer acquisition and retention. Algorithms are now sufficiently advanced that they can look at patterns of customer behavior and determine the best way to service that customer. At its most complex, emotional analytics can involve motion capture and facial analysis to detect micro-expressions that reveal a customer's emotional state. But this isn't the type of analytics most businesses would use. Businesses would more likely use text-based or verbal analysis to identify the best leads based on their word choice and the number of emotive statements they make.
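Here is a deliberately simple sketch of what text-based emotional scoring of leads might look like. The emotive word list and weights are made up for illustration; production systems would typically use trained sentiment models rather than keyword counts.

```python
# A minimal sketch of text-based emotional scoring for leads. The word list
# and weights are illustrative only.
EMOTIVE_WORDS = {
    "love": 2, "excited": 2, "need": 1, "frustrated": -1,
    "urgent": 1, "disappointed": -2, "amazing": 2,
}

def emotive_score(text: str) -> int:
    """Sum the weights of emotive words appearing in a lead's messages."""
    words = text.lower().split()
    return sum(EMOTIVE_WORDS.get(w.strip(".,!?"), 0) for w in words)

leads = {
    "lead_a": "We love the demo and are excited to move forward!",
    "lead_b": "Honestly a bit disappointed with the response time.",
}

# Rank leads by how emotionally engaged their messages appear.
for lead, message in sorted(leads.items(), key=lambda kv: emotive_score(kv[1]), reverse=True):
    print(lead, emotive_score(message))
```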

Not all big data is so complex. Predictive intelligence can also be much simpler, such as looking at a customer’s past purchases and predicting when they will need to make further purchases. Predictive intelligence is used to fantastic effect on many e-commerce marketplaces, to suggest items that may be relevant to the consumer based on the items that they have either purchased or browsed. Predictive intelligence can also be used to detect and identify certain patterns, such as whether a customer may have abandoned a shopping cart due to high shipping charges.
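A minimal sketch of that simpler kind of prediction, assuming only a list of past purchase dates: estimate the average gap between orders and project the next one. Real predictive models would also account for seasonality, churn, and product type.

```python
# Interval-based repurchase prediction: project the next purchase date from the
# average gap between a customer's past purchases of a consumable item.
from datetime import date, timedelta
from typing import List, Optional

def predict_next_purchase(purchase_dates: List[date]) -> Optional[date]:
    """Project the next purchase from the average gap between past purchases."""
    if len(purchase_dates) < 2:
        return None  # not enough history to estimate an interval
    ordered = sorted(purchase_dates)
    gaps = [(b - a).days for a, b in zip(ordered, ordered[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return ordered[-1] + timedelta(days=round(avg_gap))

history = [date(2017, 1, 5), date(2017, 2, 3), date(2017, 3, 6)]
print(predict_next_purchase(history))  # roughly one month after the last order
```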

Create Knowledge Management Systems for Superior Customer Service

Customers today often prefer to self-serve. A solid customer service experience is, thus, often one in which the customer does not need to contact the organization at all. New help desk and support solutions can be almost entirely automated, so that customers get the answers they need from a knowledge management system. That system may take the form of a help site or even a live chat with a bot. When self-service fails, customers want a variety of ways to get in touch: email, phone, instant messaging, or even text message.

By providing these additional resources for customers, organizations not only assist the customer in getting what they want, but also reduce their own administrative overhead. The more customer service can be automated, the less time and money the organization has to sink into technical support and customer service personnel.

It’s an exciting time for organizations looking to improve upon their customer experience. Through better customer experience management, companies can fine-tune their operations and ensure that their customers keep coming back.

Find Meaning in Your Data With Elasticsearch

Posted on March 28, 2017

We’re surrounded by data everywhere we go, and the amount is growing with each action we take. We rely on it regularly, probably a lot more than we even realize or would like to admit. From searching for nearby restaurants while traveling, to reading online product reviews prior to making a purchase, and finding the best route home based on real-time traffic patterns, data helps us make informed decisions every day.

However, all that data on its own is just data. The real value comes when you can find the right, relevant data at the moment it's needed. Better yet, take it a step further and find meaning in that data; that's where the real goldmine lies.

Businesses are increasingly turning to search and analytics solutions to derive more value from their data, helping to provide the deep insights necessary to make better business decisions. Some popular use cases include:

  • Intelligent Search Experiences to discover and deliver relevant content
  • Security Analytics to better understand your infrastructure’s security
  • Data Visualization to present your data in a meaningful way
  • Log Analytics to gain deeper operational insight

At Rivet Logic, we realize the importance of data, and see the challenges businesses are facing in trying to make sense of their growing data pools. We're excited to have partnered with Elastic – the company behind a suite of popular open source projects including Elasticsearch, Kibana, Beats, and Logstash – to deliver intelligent search and analytics solutions that help our customers get the most value out of their data, allowing them to make actionable improvements to websites for enhanced customer experiences.
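As a small sketch of the "intelligent search" side, here is a relevance-weighted query using the official Elasticsearch Python client. The index name and field names are assumptions, and newer client versions prefer keyword arguments over a raw body dict.

```python
# A sketch of a relevance-tuned full-text query with elasticsearch-py.
# Index and field names are assumed for illustration.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

resp = es.search(
    index="articles",
    body={
        "query": {
            "multi_match": {
                "query": "customer experience management",
                "fields": ["title^2", "body"],  # boost title matches over body matches
            }
        }
    },
)

for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```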

A Real-world Use Case

How might this apply in a real-world scenario, you ask?

An example is a global hospitality customer of ours, who has partnered with Rivet Logic to implement three internal-facing web properties that enable the company to perform its day-to-day business operations. With a reach spanning 110+ countries, these sites are deployed in the cloud on Amazon AWS across the US, Europe, and Asia Pacific, draw on many data sources, and are used across multiple devices.

This customer needed a way to gain deeper insight into these systems: how the sites are being used and the types of issues encountered, in order to improve operational efficiency. Using Elasticsearch and Kibana, the customer now has much better visibility into each site's utility. Detailed metrics, combined with the ability to perform aggregations and more intelligent queries, yield in-depth reports and dashboards over the full data set. In addition, the Elastic Stack solution aggregates all system logs into one place, making it possible to perform complex analysis and surface insights that better address operational concerns.
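Below is a sketch of the kind of log-analytics aggregation that dashboards like these are built on: log counts per day, broken down by level, over the last week. The index pattern and field names ("@timestamp", "level.keyword") are assumptions about how the logs are mapped, not details from this customer's deployment.

```python
# Aggregate a week of logs by day and log level with elasticsearch-py.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

resp = es.search(
    index="app-logs-*",
    body={
        "size": 0,  # only aggregations are needed, not individual log lines
        "query": {"range": {"@timestamp": {"gte": "now-7d"}}},
        "aggs": {
            "per_day": {
                "date_histogram": {"field": "@timestamp", "calendar_interval": "day"},
                "aggs": {"by_level": {"terms": {"field": "level.keyword"}}},
            }
        },
    },
)

for day in resp["aggregations"]["per_day"]["buckets"]:
    counts = {b["key"]: b["doc_count"] for b in day["by_level"]["buckets"]}
    print(day["key_as_string"], counts)
```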

 

NoSQL Design Considerations and Lessons Learned

Posted on July 29, 2015

At Rivet Logic, we've always been big believers in and adopters of NoSQL database technologies such as MongoDB. Now, leading organizations worldwide are using these technologies to create data-driven solutions that help them gain valuable insight into their business and customers. However, selecting a new technology can turn into an over-engineered process of checkboxes and tradeoffs. In a recent webinar, we shared our experiences, thought processes, and lessons learned building apps on NoSQL databases.

The Database Debate

The database debate is never-ending, and each type of database has its own pros and cons. Amongst the multitude of options, some of the top technologies we've seen out in the market include:

  1. MongoDB – Document database
  2. Neo4j – Graph database
  3. Riak – Key-value data store
  4. Cassandra – Wide-column database

Thinking Non-Relational

When it comes to NoSQL databases, it's important to think non-relationally. With NoSQL databases, there's no SQL query language and there are no joins. They also aren't a drop-in replacement for relational databases; the two are completely different approaches to storing and accessing data.

Another key consideration is normalized vs. denormalized data. Whereas data is normalized in relational databases, normalization is neither a necessity nor an important design consideration for NoSQL databases. In addition, you can't use the same tools, although that's improving as technology companies invest heavily in making their tools integrate with various database technologies. Lastly, you need to understand your data access patterns and what access looks like from the application level down to the DB.

Expectations

Also keep in mind your expectations and make sure they're realistic. Whereas the relational model is over 30 years old, the NoSQL model is much younger at roughly 7 years, with enterprise adoption occurring only within the last 5 years. Given that difference in maturity, NoSQL tooling isn't going to be as mature as what surrounds relational DBs.

When evaluating new DB technologies, you need to understand the tradeoffs and what you're willing to give up, whether that's data consistency, availability, or other features core to the DB, and determine whether the benefits outweigh the tradeoffs. And these DBs aren't all created equal: they're built on different models for data storage and access and use different query languages, all of which require a ramp-up.

In addition, keep in mind that scale and speed are relative to your needs. Understanding all of these factors up front will help you make the right decision for both the near and long term.

Questions to Ask Yourself

If you’re trying to determine if NoSQL would be a good fit for a new application you’re designing, here are some questions to ask yourself:

  1. Will the requirements evolve? Most likely they will, rarely are all requirements provided upfront.
  2. Do I understand the tradeoffs? Understand your must have vs. like to have.
  3. What are the expectations of the data and patterns? Read vs. write, and how you handle analytics (understand operational vs. analytics DB and where the overlap is)
  4. Build vs. Buy behavior? Understand what you’re working with internally and that changing internal culture is a process
  5. Is the ops team on board? When introducing new DB technologies, it’s much easier when the ops team is on board to make sure the tools are properly optimized.

Schema Design Tidbits

Schema is one of the most critical things to understand when designing applications for these new databases. Ultimately, your data access patterns should drive the design. We'll use MongoDB and Cassandra as examples, as they're leading NoSQL databases with different models.

When designing your schema for MongoDB, it's important to balance your app's needs, performance, and data retrieval. Your schema doesn't have to be defined on day one, which is a benefit of MongoDB's flexible schema. MongoDB stores documents in collections, which are similar to tables in relational DBs, except that collections don't enforce structure. In addition, you have the option of embedding related data within a document, which, depending on your use case, can be highly beneficial.
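A minimal sketch of embedding with PyMongo follows. The database, collection, and field names are illustrative; the point is that related orders live inside the customer document, so a single read returns everything with no join.

```python
# Embedding related data in a single MongoDB document with PyMongo.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client.shop

db.customers.insert_one({
    "name": "Ada Lovelace",
    "email": "ada@example.com",
    "orders": [  # embedded documents instead of a separate, joined table
        {"sku": "A-100", "qty": 2, "total": 59.90},
        {"sku": "B-200", "qty": 1, "total": 19.95},
    ],
})

customer = db.customers.find_one({"email": "ada@example.com"})
print(customer["orders"])
```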

Another technology to consider is Cassandra, a wide-column database where you model around your queries. By understanding the access patterns and the types of questions your users will ask the DB, you can design a more accurate schema. You also want to distribute data evenly across nodes. Lastly, you want to minimize reads that span partitions (groups of rows that share the same key).
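Here is a sketch of that query-first modeling with the Python Cassandra driver. The keyspace and column names are assumptions; the table is shaped around a single question, "show the most recent orders for a given customer," with the customer as the partition key and order time as the clustering column.

```python
# Query-driven table design in Cassandra using the Python driver.
import uuid
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])
session = cluster.connect("shop")  # assumes the keyspace already exists

session.execute("""
    CREATE TABLE IF NOT EXISTS orders_by_customer (
        customer_id uuid,
        order_time  timestamp,
        sku         text,
        total       decimal,
        PRIMARY KEY ((customer_id), order_time)
    ) WITH CLUSTERING ORDER BY (order_time DESC)
""")

customer_id = uuid.uuid4()  # placeholder id for illustration
rows = session.execute(
    "SELECT order_time, sku, total FROM orders_by_customer "
    "WHERE customer_id = %s LIMIT 10",
    [customer_id],
)
for row in rows:
    print(row.order_time, row.sku, row.total)
```

Because the partition key is the customer and rows cluster by time, the "latest orders" query reads a single partition, which is exactly the kind of access pattern Cassandra rewards.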

Architecture Examples

MongoDB has a primary-secondary architecture, where a secondary is promoted to primary if the primary ever fails, which is how it avoids ever having the DB offline. Writes, consistency, and durability are handled through the primary replicating to the secondaries. So in this model the database is always available, with data kept consistent and replicated across nodes, all performed in the background by MongoDB. In terms of scalability, you scale horizontally, adding nodes as you go, which introduces the concept of sharding: how data is dynamically distributed as the app grows.
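A small sketch of what this looks like from the application side, assuming a three-member replica set with placeholder host names: connect to the set, allow reads from secondaries, and require majority acknowledgment on writes.

```python
# Connecting to a MongoDB replica set with PyMongo. Host names and the replica
# set name are placeholders; "majority" write concern waits for the write to
# replicate to most members before acknowledging it.
from pymongo import MongoClient, WriteConcern

client = MongoClient(
    "mongodb://db1.example.com,db2.example.com,db3.example.com/?replicaSet=rs0",
    readPreference="secondaryPreferred",  # allow reads from secondaries
)

orders = client.shop.get_collection(
    "orders", write_concern=WriteConcern(w="majority")
)
orders.insert_one({"sku": "A-100", "qty": 1, "total": 29.95})
```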

On the other hand, Cassandra has a ring-based architecture, where data is distributed across nodes, similar in spirit to MongoDB's sharding. The patterns are similar but implemented differently in each technology. The diagram below illustrates example architectures for MongoDB and Cassandra. Both can be distributed globally with dynamic scalability, the benefit being that you can add nodes effortlessly as you grow.

NoSQL Data Solution Examples

Some of the NoSQL solutions we’ve recently built include:

Data Hub (aka 360 view, omni-channel) – A collection of various data sources pooled into a central location (in this case we used MongoDB), with use cases built around the data. This enables business units to access data they might not previously have had access to, empowering them to build new products and understand how other teams operate, ultimately leading to new revenue-generating opportunities and improved processes across the organization.

User Generated Content (UGC) & Analytics – UGC sessions (e.g. blog comments and shares) that need to be stored and analyzed in the backend. The document model often makes sense for this type of solution. However, as technologists continue to grow their NoSQL skill sets, there's going to be an increasing overlap of similar use cases being built across the various NoSQL DB types.

User Data Management – Also known as Profile Management: storing information about the user, such as what they recently viewed, products they bought, and so on. With a document model, the flexibility becomes powerful for evolving the application, since you can add attributes as you go without needing every requirement defined out of the gate.
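A minimal sketch of that flexibility, with illustrative collection and field names: adding a brand-new profile attribute requires no migration, just a write that sets it.

```python
# Evolving a profile document without a schema migration in MongoDB.
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017").app

db.profiles.update_one(
    {"user_id": 42},
    {"$set": {"recently_viewed": ["sku-123", "sku-456"]}},  # brand-new field
    upsert=True,
)
print(db.profiles.find_one({"user_id": 42}))
```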

Lessons Learned

When talking about successful deployments, some of the lessons learned we’ve noticed include:

  1. Schema design is an ongoing process – From a Data Hub perspective, defining that “golden record” is not always necessary, as long as you define consistent fields that can be applied everywhere.
  2. Optimization is a team effort – It's not just the developer's job to optimize the schema, just like it's not just the Ops team's job to keep the DB always on. NoSQL gives you tunability across both, and the best performance and results come from working together.
  3. Test your shard keys (MongoDB) – If sharding is a new concept for you, do your homework, understand it, and validate your choice with someone who knows the DB very well.
  4. Don’t skimp on testing and use production data – Don’t always assume that the outcome is going to be the same in production.
  5. Shared resources will impact performance – Keep in mind if you’re deploying in the cloud that shared resources will impact distributed systems. This is where working with your Ops team will really help and eliminate frustrations.
  6. Understand what tools are available and where they are in maturity – Don’t assume existing tools (reporting, security, monitoring, etc.) will work in the same capacity as with Relational DB’s, and understand the maturity of the integration.
  7. Don’t get lost in the hype – Do your homework.
  8. Enable the “data consumer” – Enable the person that’s going to interact with the DB (e.g. data analyst) to make them comfortable working with the data.
  9. JSON is beautiful

To summarize: education will eliminate hesitation, so don't get lost in the marketing fluff. Get Ops involved; the earlier and more often you work with your Ops team, the easier and more successful your application and your experience with these technologies will be. Lastly, keep in mind that these are just DB tools, so you'll still need to build a front end.

Click here to see a recording of the webinar.

Click here for the webinar slides.