
Why 50 Percent of Companies Haven’t Started Using Big Data

Posted on June 04, 2018


Nearly every organization today understands the importance of big data. But while large volumes of data may be collected, many businesses are finding themselves unable or unwilling to create a complete data strategy. Businesses are proving to be hesitant to fully transition over to a data-driven approach. It may require a shift in company culture and more motivation from the top for businesses to truly commit to new data analysis processes.

What Is Big Data Being Used For?

Studies by Dell revealed that organizations utilizing data were able to grow 50% faster than their peers. But how is this growth being achieved? Big data is being used for marketing efforts, decision making, and process optimization. Through big data, companies are able to make more intelligent decisions faster — and are able to leverage their resources more effectively.

  • Marketing efforts. 41% of organizations using big data are using it to improve their marketing efforts: 37% are optimizing their marketing strategies in general, while 37% are focusing on social media campaigns. Big data is an easy way to identify purchasing and engagement patterns among customers. Through big data, companies can better score leads and identify their most valuable customers.
  • Risk-based decision making. 30% of big data strategies center on risk-based decision making. Organizations can model potential scenarios to identify the highest-risk situations and make educated decisions. Intuition alone may not be enough; more data and less guesswork could ultimately lead to better decision making.
  • Business process management. Through data-driven business processes, companies are able to reduce their expenses. 49.2% of Fortune 1000 executives reported that they have started big data analysis for business processes and have seen value from their strategies.
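As a concrete illustration of the lead-scoring idea mentioned above, a minimal rule-based scorer might weight a few engagement signals and rank leads by the result. The field names and weights below are hypothetical, not a production scoring model:

```python
# Toy rule-based lead scoring: weight engagement signals and rank leads.
# All field names and weights are illustrative assumptions.

def score_lead(lead):
    score = 0
    score += lead.get("pages_viewed", 0) * 2          # browsing depth
    score += lead.get("emails_opened", 0) * 3         # email engagement
    score += 25 if lead.get("requested_demo") else 0  # strong buying signal
    return score

leads = [
    {"name": "A", "pages_viewed": 10, "emails_opened": 2, "requested_demo": True},
    {"name": "B", "pages_viewed": 3, "emails_opened": 1},
]
# Highest-scoring (most valuable) leads first.
ranked = sorted(leads, key=score_lead, reverse=True)
```

Real scoring models would learn these weights from historical conversion data rather than hard-coding them.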

But there’s another important statistic from this Dell study: 39% of the organizations that established a big data plan were not certain what the benefits of their big data strategy actually were. That uncertainty is a large part of why many companies are not using big data.

Why Are Organizations Failing to Turn to Big Data?

Nearly every organization now collects data, but most fail to actually use what they collect. Many of these companies are satisfied that the data they are gathering will eventually be useful, but they haven’t developed a strategy or invested in a platform to make use of it.

Despite the tremendous advantages of big data, only 49% of large businesses are currently looking towards big data implementation. This is compared to 21% of small businesses and 19% to 26% of mid-sized businesses. Interestingly, businesses with 50 to 249 employees were far more likely to implement a big data plan than businesses with 250 to 999 employees.

Businesses are struggling to transition their decision-making from a more instinctive and gut-driven process to a more objective and data-driven process. It can be difficult for executives to yield power to a purely statistical and analytical process and this can cause some hesitation. Further, businesses may be afraid to invest in big data technology without a clear idea of its benefits.

Every business is different and consequently requires a different data model. There are very few businesses for which a “one size fits all” data strategy would be effective. Because of this, businesses are forced to develop their own big data strategy from scratch. This can also feel overwhelming to businesses, especially those that are not otherwise highly technical.

Businesses may also be reluctant to engage further in data-related pursuits as security becomes more of a concern. Ransomware, data breaches, and similar attacks have made many businesses more cautious about data and data storage, which may slow investment in new data platforms. Businesses that already have a data platform may feel they aren’t getting the most out of their software and become discouraged.

Though businesses may not be transitioning to big data as quickly as they should, most are aware that they need to consider it. It’s simply a matter of implementation. Businesses that want to implement data-driven solutions but aren’t sure how should consider consulting a professional services firm to determine a strategy and approach that delivers the most business value, through technology that aligns with their business goals.

Top 5 Health IT Challenges for 2018

Posted on January 31, 2018


Our mission is to deliver riveting digital experiences for our healthcare clients. A new year always inspires a fresh look, and 2018 will bring a new (as well as continuing) set of challenges for healthcare executives. If you want to know what healthcare leaders are most concerned about, just ask them. Surveyors for Managed Healthcare Executive and the PwC Health Institute did precisely that.

The 2 Surveys Disclosed 5 Challenges

This post summarizes 5 Health IT challenges healthcare executives say are still top of their hit parade.

Challenge #1. Using big data to improve quality and reduce costs continues to lag.

Only 12% of survey respondents reported that their organization excels at capturing and harvesting all the data it generates, along with data it can harvest from other sources. While that percentage is unchanged from the 2016 survey, 46% of respondents report they have come “a long way” in this area, up from 39% last year.

Progress is handicapped by the fact that, even though more healthcare data is being generated, the information is scattered across multiple sources: patients, providers, and payers. There is no single source for healthcare data, and when patients migrate between health plans or providers, the data does not follow them.

Most organizations do not have the technology to capitalize on big data. The data is everywhere, but it is locked in silos, in different formats, and spread across a variety of sources. To exploit it, organizations need a big data infrastructure that can capture, store, and analyze it at a usable scale.

Our take on implications for healthcare clients: New ways to manage big data are growing at an explosive rate. It is all about aligning business goals with the technology. Rivet Logic’s big data solutions leverage the power of MongoDB to get a focused view of opportunities for cost reduction along with increases in productivity.

Challenge #2. Value-based Reimbursement Initiatives are lagging.

Value-based programs reward healthcare providers with incentive payments for the quality of care they provide to Medicare patients. Organizations continue to struggle in this area because the traditional fee-for-service system does not mesh well with a metrics- and outcome-driven, value-based care approach. Delivering value-based care also requires new infrastructure, workflows, and information sources, which are vastly different from those already in place.

How Rivet Logic can help you migrate from fee-for-service to value-based care: Improving the patient experience is at the core of value-based care. Organizations need better collaborative processes and the right mix of tools, ones that promote transparency and better internal communication.

That communication relies on patient profile management and on consolidating the customer experience into a single data-gathering view, one that avoids overlapping information in data silos. For a detailed view of that process, download our data sheet to learn more about how to address customer identity management.

Challenge #3. Patient experience must be a priority and not just a portal.

Just under half (49 percent) of provider executives reported that one of their top three priorities during the upcoming years will be revamping the patient experience. That effort will require healthcare organizations to “connect data points across and beyond the organization to understand how the patient’s experience fits” into the business.

Again, executives agree that it all centers on bringing in multiple data sets. It requires “governing them, establishing ownership, and utilizing them to provide real-time, actionable information about the patient.”

Connectivity is the key. The patient experience is being transformed by technology. A connected health system requires better engagement of everyone—providers, their employees, and, most importantly, the patient. Digital solutions, like patient portals and mobile applications, are supplanting visits to the office. Patients can self-monitor their conditions and transmit diagnostics over their smartphones. For more insight on this challenge and how Rivet Logic can help with that connectivity, download our data sheet to learn more about enabling better care with a connected health system.

Challenge #4.  Securing the Internet of Things.

PwC predicts that there will be more cybersecurity breaches. So, hospitals and health systems need to be educated and prepared. PwC reported that 95 percent of the surveyed executives believed their organization is protected. However, only 36 percent had management access policies in place. Worse yet, only 34 percent could point to a cybersecurity audit process.

Managed Services is one solution. Rivet Logic provides a flexible and scalable array of automated processes, services, and on-demand infrastructure designed to reduce IT costs without sacrificing quality or security.

Challenge #5. Artificial intelligence will be a healthcare coworker.

Healthcare employees function best when automation takes over tiring, labor-intensive tasks. An average of 70 to 80 percent of business executives reported that they plan to automate routine paperwork, scheduling, timesheet entry, and accounting with AI tools. In fact, a whopping 75 percent of healthcare executives “plan to invest in AI in the next three years.”

Again, managed services provide the pathway to keeping up with developments in IT in an environment of an expected continuing shortage of healthcare professionals.

Join us March 5-9, 2018 at HIMSS18

Rivet Logic will be exhibiting at HIMSS18 in Las Vegas in the Connected Health Experience pavilion. Discussions of approaches and solutions to the above-mentioned challenges (and much, much more) will be on the agenda, including:

  • Clinical Informatics & Clinician Engagement
  • Compliance, Risk Management & Program Integrity
  • Data Analytics/Clinical & Business Intelligence
  • EHRS
  • Health Informatics Education, Career Development & Diversity
  • HIT (Health IT) Infrastructure & Standards
  • Improving Quality Outcomes Through Health IT
  • Patient Safety & Health IT
  • Privacy, Security & Cybersecurity

The Data Analytics Trends that Will Shape 2018

Posted on January 10, 2018


As a field, data analytics is only growing. Not only has the industry of data science broadened substantially, but many companies are finding themselves devoting large amounts of resources towards understanding data analytics and trying to identify new trends. This reliance upon data is only going to grow through 2018, as companies are finding that the data that they collected may contain even more useful information than they previously believed.

2018 is going to find many companies making better use of the data that they already have, and fine-tuning their existing data collection and analysis methods.

Better Personalization Metrics

Industry leaders are hard at work creating incredibly detailed profiles of their customers. Companies don’t need to develop this information themselves. Google, for instance, has fine-tuned its customer profiles and made these customer profiles accessible to those using its analytics and advertising services. Social media platforms have been able to capture customer information similarly, from Facebook to LinkedIn.

The result of this is that advertising is likely to become hyper-personalized to each customer. Not only will companies know the demographics of each customer (age, gender, location), but also their buying habits, how much money they make, and which locations they frequent. Businesses will be able to increasingly target customers and anticipate their needs, ideally creating a situation in which advertising becomes more valuable to the customer.

Augmented Reality Systems

Augmented reality has been kicked around for the last decade, held back by issues regarding processing speed and (perhaps more importantly) battery life. Augmented reality feeds digital information about an individual’s location directly to them, often through a visible “heads up” display.

Not only is this going to change the way individuals interact with the world, but it’s also going to change the data collected. How long do users spend looking at a specific product? Which products or locations do they show further interest in? These questions will yield incredibly valuable data points, which will again be used to build a realistic model of what customers want and need.

Streamlined Data Solutions

Companies have built up their data caches. Now they’re looking for streamlined, agile solutions that help them make use of the data. In the past, companies were satisfied with collecting as much data as possible and then mining it for as many insights as they could find. Now, companies are more focused on fine-tuning their systems, generating and using the minimum amount of data needed for effective results.

This will create a rise in agile data science, whereby companies can quickly create data sets, respond to and modify them, and produce tangible results from them. The emphasis will be less on the data itself and more on what the data can do for the company.

The Science of the Customer Journey

Buyer personas have led to further exploration of the customer journey, a science that attempts to identify the stages that customers go through when investigating and making a purchase. Customer journeys are an incredibly effective way to understand customers and their unique needs.

Data science is likely to be integrated into further understanding of the customer journey. What drives a customer to seek a product? How often does a customer generally research a product? What types of research are most effective and most compelling? What makes a customer more or less likely to find a company and engage in a purchase?

Customer journeys are designed to model customer behavior, so that companies are able to more accurately give customers the information and the prompts they need to continue their journey. In the coming year, this will evolve into a science of its own, and marketers will likely be collecting more customer behavior-related data than ever.

Machine Intelligence Continues to Advance

Alongside all of this, machine intelligence and machine learning will continue to advance. Many businesses have large volumes of data, but identifying patterns within that data has proven difficult. More advanced machine learning algorithms will be developed to glean usable, actionable insights from stored data. Machine intelligence will increasingly be used for tasks such as scoring leads, identifying keywords, and targeting specific demographics.

More advanced learning algorithms are being developed that can, within their parameters, improve their own functions. With the right data sets and the right code, marketing algorithms will be able to fine-tune themselves and optimize their own performance. This will be especially useful in A/B (split) testing, as algorithms will be able to try different marketing variations and determine the optimal configuration on their own.
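Self-optimizing split tests of this kind are often implemented as multi-armed bandits. Here is a minimal epsilon-greedy sketch; the variant names and statistics are made up for illustration, and real systems would use more sophisticated allocation:

```python
import random

# Epsilon-greedy bandit: mostly show the best-performing variant so far,
# but keep exploring the alternatives a fraction of the time.
def choose_variant(stats, epsilon=0.1, rng=random):
    if rng.random() < epsilon:  # explore: pick a random variant
        return rng.choice(list(stats))
    # exploit: pick the variant with the best observed click-through rate
    return max(stats, key=lambda v: stats[v]["clicks"] / max(stats[v]["views"], 1))

def record(stats, variant, clicked):
    # Update observed performance after each impression.
    stats[variant]["views"] += 1
    stats[variant]["clicks"] += int(clicked)

stats = {
    "headline_a": {"views": 100, "clicks": 12},
    "headline_b": {"views": 100, "clicks": 7},
}
```

Over time the better headline is shown more often, while the exploration fraction keeps the test honest if performance shifts.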

Small amounts of this are already cropping up in apps and social media platforms, such as the ability of an algorithm to determine what is most likely to get a profile clicked on, or which photos and posts are most engaging. This can be used in a marketing sense to determine not only which products are most attractive to customers, but which photos they prefer to see, and what descriptions they’re most interested in.

For the most part, 2018 is going to see a maturation of data analytics and data science, as companies invest more money into both collecting and understanding their data. But technology itself is going to play a significant role as well, as the technology behind machine learning and AI is becoming more sophisticated and complex. Either way, companies are going to have to invest more in their data if they want to understand their customers and continue to market directly to them.

DXP Series, Part III: DXP and Data-Driven Decision-Making

Posted on December 06, 2017


Think about how you make important business decisions. Decision-making begins at the point where intuition takes over from analyzing the data.  If your data analysis carries far less weight than intuition, your decision process may not be taking full advantage of available data.

If so, you are not alone. Bi-Survey.com surveyed over 720 businesses. The survey found that 58 percent of respondents based about half of their regular business decisions “on gut feel or experience.” On the other hand, over 67 percent of those businesses “highly valued” information for decision-making, and 61 percent considered information “as an asset.”

The survey showed that when businesses were not using information as the basis for decision-making, it was because the information was not available or reliable. They were either not collecting it or were not using what they had.

KPIs are there, but not the data to read them

Another significant finding involved the role of key performance indicators. There is an important connection between KPIs and the data that measure and drive them. Here is where another disconnect stood out like a beacon: Nearly 80 percent of the companies had defined and standard sets of KPIs, but only 36 percent were using them “pervasively across the organization.”

So there was an obvious disconnect between valuing the information and a willingness to use it. In this post we shall address that contradiction and explore ways to close the gap between valuing the data and using it for data-driven decisions.

How DXP leverages data analytics

The road to data-driven decisions must go through data analytics.  In a previous blog, we discussed how data analytics and other tools plug into the realm of DXP. Data analytics are what help you find meaning in the data you generate and collect.

Those meanings are what drive the decisions and strategies that focus on efficiency and excellent customer service. In terms of business decisions, the ones based on verifiable and quality data are the most beneficial to the business. They are data-driven.

So, data-driven decision management is a way to gain an advantage over competitors. One MIT study found that companies that stressed data-based decisions achieved productivity and profit increases of 4% and 6%, respectively.

Two “how-tos” to get on the road to data-driven decision management

#1. How to head towards a data-driven business culture (and benefit from it)

The survey showed that respondents were operating at half capacity when it came to using data-driven decision methods. To unlock the process as well as the data, businesses need to do the following:

  • Focus on and improve data quality.
  • Ease and lower the cost of information access. Break down those proprietary silos and use the best data-extraction tools available.
  • Improve the way the organization presents its information. There are many outstanding presentation products on the market.
  • Make the information easier to find, and speed up the process where users can access the information.
  • Get senior management on board and aware of the value of business intelligence and data-based decision making. Promote a culture of collaborative decision-making.

#2. How to improve internal data management

Data governance (where the data comes from, who collects and controls it) is a major obstacle to taking advantage of data-driven decision benefits. Survey recommendations were that companies should take the following steps:

1. Build an IT architecture that is agile and which can integrate the growing number of data sources required for decision-making. Plug into external big-data sources and start harvesting them.

2. Look for ways to break down barriers to promote cross-departmental cooperation and data alignment. A business intelligence competency center (BICC) can play a major role in achieving that goal.

3. Re-define and use KPIs across the organization and align those measures of success with a focus on data governance.

A strategy for applying data-based decision-making

Bernard Marr, in his Forbes online piece, offers the following suggestions for applying data to decision-making in any business:

1. Start simply.

To overcome the overload of big data and its endless possibilities, design a simplified strategy. Cut to what your business is looking to achieve.  Rather than starting with the data you need, start with what your business goals are.

2. Focus on the important.

Concentrate on the business areas that are most important to achieving the foregoing strategy. “For most businesses,” says Marr, “the customer, finance, and operations areas are key ones to look at.”

3. Identify the unanswered questions.

Determine which questions you need to answer to achieve the above focus. Marr points out that when you move from “collect everything just in case” to “collect and measure x and y to answer question z,” you can massively reduce your cost and stress levels.

4. Zero in on the data that is best for you.

Find the ideal data for you: the data that will answer the most important questions and fulfill your strategic objectives. Marr stresses that no type of data is more valuable or inherently better than any other type.

5. Take a look at the data you already have.

Your internal data is everything your business currently has or can access. You are probably sitting on much of the information you know you need. If the data has not been collected, put a data collection system in place or go for external resources.

6. Make sure the costs and effort are justified.

Marr suggests treating data like any other business investment. To justify the cost and effort, you need to demonstrate that the data has value to your long-term business strategy. It is crucial to focus only on the data you need. If the costs outweigh the benefits, look for alternative data sources.

7. Set up the processes and put the people in place to gather and collect the data.

You may be subscribing to or buying access to a data set that is ready to analyze, in which case your data collection efforts are easier. However, most data projects require some data collection to get them moving.

8. Analyze the data to get meaningful and useful business insights.

To extract those insights, you need to plug into analytics platforms that show you something new. Look for platforms that produce the reports, analysis, and dashboard displays that tell you what you need to know.

9. Show your insights to the right people at the right time.

Present your data in a way that overcomes the size and sophistication of the data set. The insights you present must inform decision-making and improve business performance. Go for style as well as substance.

10. Incorporate what you learned from the data into the business.

Here is where you turn data into action. When you apply the insights to decision-making, you transform your business for the better. That is the crux of data-driven decision-making, and it is also the most rewarding part of the venture.

Summary and Conclusions

1. Business decision-making based on data results in greater reliability, efficiency, and profitability. DXP leverages data analytics towards the goal of more data-based decision making and achieving a competitive advantage.

2. Migrating towards a data-driven business culture requires unlocking the 50 percent of the decision-making and data currently not being used. It requires improved internal data management and governance and breaking down barriers to internal communication.

3. Finally, when those barriers are down, you can begin a strategy for applying data-based decision making. Start simple and focus on what business areas you need to improve and determine what data you need. No data is better or more valuable than any other; the key is to find the data that meets your objective, analyze it, and translate it into actionable decisions and improvement.

Machine Learning is State-of-the-Art AI, and It Can Help Enhance the Customer Experience

Posted on October 05, 2017


Is artificial intelligence the same as machine learning? Machine learning is really a subset of artificial intelligence; a more precise way to view it is as state-of-the-art AI. Machine learning is a “current application of AI” centered on the notion that “we should…give machines access to data and let them learn for themselves.” There is no limit to that data (hence Big Data); the challenge is harnessing it for useful purposes.

In his Forbes Magazine piece, contributor Bernard Marr describes AI as the “broader concept of machines being able to carry out tasks in a way we would consider ‘smart.’” So, AI is any technique that allows computers to imitate human intelligence through logic, “if-then” rules, decision trees, and its crucial component, machine learning. Machine learning, as an application of AI, employs abstruse (i.e., difficult to understand) statistical techniques that improve machine performance through exposure to Big Data.

AI has broad applications…

Companies around the world use AI in information technology, marketing, finance and accounting, and customer service. According to a Harvard Business Review article, IT garners the lion’s share of AI activity, with applications ranging from detecting and deflecting security intrusions to automating production management work. Beyond security and industry, AI has broad applications in improving customer experiences with automated ticketing, voice- and face-activated chatbots, and much more.

Machine learning is data analysis on steroids…

AI’s subset, machine learning, automates its own model building. Programmers design and use algorithms that are iterative, in that the models learn by repeated exposure to data. As the models encounter new data, they adapt and learn from previous computations. The repeatable decisions and results are based on experience, and the learning grows exponentially.
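The iterative model building described above can be illustrated with a tiny example: a model that discovers the line y = 2x + 1 purely from repeated exposure to data points. This is a minimal sketch of gradient-based learning, not any particular vendor’s algorithm:

```python
# Iterative model building: the model's parameters improve with each
# pass over the data, rather than being programmed explicitly.
def fit_line(points, lr=0.05, epochs=1000):
    w, b = 0.0, 0.0                     # start knowing nothing
    for _ in range(epochs):
        for x, y in points:             # repeated exposure to the data
            error = (w * x + b) - y     # how wrong is the current model?
            w -= lr * error * x         # adjust parameters based on experience
            b -= lr * error
    return w, b

# Learn y ≈ 2x + 1 from examples alone.
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = fit_line(data)
```

Each pass nudges the parameters toward values that fit the data better, which is the essence of “learning from previous computations.”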

The return of machine learning

Having experienced somewhat of a slump in popularity, AI and machine learning have, according to one software industry commentator, Jnan Dash, seen “a sudden rise” in their deployment. Dash points to an acceleration in AI/machine learning technology and a market value jump “from $8B this year to $47B by 2020.”

Machine learning, according to one Baidu scientist will be the “new electricity,” which will transform technology. In other words, AI and machine learning will be to our future economy what electricity was to 20th century industry.

The big players are pushing AI and machine learning. Apple, Google, IBM, Microsoft, and social media giants Facebook and Twitter are accelerating their promotion of machine learning. One Google spokesman, for example, describes machine learning as “a core transformative way in which we are rethinking everything we are doing.”

How machine learning has transformed General Electric…

A striking example of how AI and machine learning are transforming one of the oldest American industries, General Electric, is highlighted in this Forbes piece. Fueled by the power of Big Data, GE has leveraged AI and machine learning in a remarkable—and ongoing—migration from an industrial, consumer products, and financial services firm “to a ‘digital industrial’ company” focusing on the “Industrial Internet.” As a result, GE realized $7 billion in software sales in 2016.

GE cashed in on data analytics and AI “to make sense of the massive volumes of Big Data” captured by its own industrial devices. Its insight that the “Internet of Things” and machine connectivity were only the first steps in digital transformation led to the realization that “making machines smarter required embedding AI into the machines and devices.”

After acquiring the necessary start-up expertise, GE figured out the best ways to collect all that data, analyze it, and generate the insights to make equipment run more efficiently. That, in turn, optimized every operation from supply chain to consumers.

5 ways machine learning can also enhance the customer experience…

Machine learning can integrate business data to achieve big savings and efficiency to enhance customer experiences, by:

  1. Reading a customer’s note and figuring out whether it is a complaint or a compliment
  2. Aggregating the customer’s personal and census information to predict buying preferences
  3. Evaluating a customer’s online shopping history or social media participation and placing a new product offering in an email, webpage visit, or social media activity
  4. Intuitively segmenting customers through mass customer data gathering and grouping, and targeting ads to each customer segment
  5. Automating customer contact points with voice- or face-activated “chatbots”
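The first item, telling a complaint from a compliment, can be sketched with a simple word-score heuristic. Production systems would use a trained text classifier; the word lists here are illustrative assumptions:

```python
import re

# Toy note classifier: count emotive words to label a customer note.
# These word lists are illustrative, not a real sentiment lexicon.
NEGATIVE = {"broken", "late", "refund", "terrible", "disappointed"}
POSITIVE = {"great", "love", "thanks", "excellent", "helpful"}

def classify_note(text):
    words = set(re.findall(r"[a-z']+", text.lower()))
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "compliment"
    return "complaint" if score < 0 else "neutral"
```

A learned model would replace the hand-picked word lists with weights estimated from labeled customer notes.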

How Rivet Logic can make you future-ready and customer friendly

Your business may be nowhere near the size of General Electric. You do, however, have a level playing field when it comes to leveraging Big Data and machine learning products to a winning strategy. What we do is help you plan that strategy by:

  • Aligning your business goals with technology—What are the sources of your own data, and how can you harness the power of NoSQL databases, for example?
  • Designing your user experience—What do you need? A custom user interface, or a mobile app with intuitively simple user interfaces?

We can do that and much more. Contact us and we’ll help make your business future-ready to collect, harvest, and leverage all the great things you are doing now.

Optimizing Your Customer Experience Management

Posted on August 15, 2017


A customer’s experience with your organization may, in fact, be more important than the quality of either your products or your services. Customers today want to feel valued — they want to be able to have their needs both anticipated and fulfilled. Improving upon and optimizing your customer’s experiences is called customer experience management. Through new technologies, there are many ways that you can improve upon your customer experience management and, additionally, your ROI.

Integrate Your CRM, Marketing Automation, and Media Solutions Into a Single Infrastructure

Optimizing customer experience begins with consolidating and analyzing your data. To that end, integrating your CRM and marketing solutions can be an incredibly effective first step. Comprehensive CRM and marketing automation solutions — such as Salesforce, Marketo and HubSpot — almost universally come with third-party integrations out-of-the-box. For more distinct infrastructures, APIs, importing and exporting, or custom programming may be required. Regardless, this will create a single infrastructure that contains all of your customer information.

Not only does this improve analytics, but it also improves customer care overall. Both customer service representatives and sales personnel will have all of the information they need to quickly service the customers and get them the information that they need. Marketing campaigns will be able to target customers based on their prior behaviors — and will be able to prompt them towards purchasing more effectively.

Develop an Omni-Channel Approach through Content Management Systems

Content Management Systems (CMS) make it easier to push content directly to a multitude of different channels. Social media, email marketing, and websites can all be consolidated under a single content system — so that a single push of the button can update customers on a variety of platforms. Omni-channel approaches make it easier to scale your organization upwards and to reach out to individuals across multiple demographics and interests. Through regular content distribution, companies can achieve better organic growth and improve upon their inbound marketing.
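The "single push of a button" idea above can be sketched as one publish call fanning content out to registered channels. The channel handlers here are stand-ins; a real CMS would call each platform's API behind the same interface:

```python
# Sketch: one "publish" call fanning content out to several channels.
# Handlers are stand-ins for real platform API calls.

def publish(content, channels):
    """Send one piece of content to every registered channel handler."""
    results = {}
    for name, handler in channels.items():
        results[name] = handler(content)
    return results

channels = {
    "website": lambda c: f"page updated: {c['title']}",
    "email":   lambda c: f"newsletter queued: {c['title']}",
    "social":  lambda c: f"post scheduled: {c['title']}",
}

results = publish({"title": "Spring Launch"}, channels)
print(results)
```

Adding a new channel is then just registering another handler, which is what makes the omni-channel approach scale.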

A CMS is particularly useful for lead procurement and demand generation. With the use of a CMS, a strong and strategic digital marketing campaign can ensure that leads come to the business rather than the business having to procure leads. Organizations are thus able to improve upon their ROI, extend their marketing reach, and refocus their budget to additional areas of advertising and support.

Explore Big Data, Such as Emotional Analytics and Predictive Intelligence

Emotional analytics and big data can work together to develop new strategies for customer acquisition and retention. Algorithms are now sufficiently advanced that they can look at patterns of customer behavior and determine the best way to service that customer. At its most complex, emotional analytics can involve motion capture and facial analysis to detect micro-expressions that may reveal the customer’s emotional state. But this isn’t the type of analytics most businesses would use. Businesses, instead, would most likely use text-based or verbal analysis to detect the best leads based on their word usage and the number of emotive statements they make.

Not all big data is so complex. Predictive intelligence can also be much simpler, such as looking at a customer’s past purchases and predicting when they will need to make further purchases. Predictive intelligence is used to fantastic effect on many e-commerce marketplaces, to suggest items that may be relevant to the consumer based on the items that they have either purchased or browsed. Predictive intelligence can also be used to detect and identify certain patterns, such as whether a customer may have abandoned a shopping cart due to high shipping charges.
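The purchase-timing idea above can be shown in a few lines: estimate the next purchase date from the average gap between past purchases. Real predictive models weigh far more signals; this is only the simplest version of the idea:

```python
from datetime import date, timedelta

# Sketch: predict a customer's next purchase date from the mean interval
# between past purchases. A production model would use far more signals.

def predict_next_purchase(purchase_dates):
    """Given chronologically sorted purchase dates, estimate the next one."""
    if len(purchase_dates) < 2:
        return None  # not enough history to estimate an interval
    gaps = [(b - a).days for a, b in zip(purchase_dates, purchase_dates[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return purchase_dates[-1] + timedelta(days=round(avg_gap))

history = [date(2018, 1, 1), date(2018, 2, 1), date(2018, 3, 2)]
print(predict_next_purchase(history))  # roughly one month after the last buy
```

A prediction like this is what drives "time to reorder?" emails and similar re-engagement campaigns.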

Create Knowledge Management Systems for Superior Customer Service

Customers today often prefer to self-service. A solid customer service experience is, thus, often one in which the customer does not need to contact the organization at all. New help desk and support solutions can be nearly entirely automated, so that customers can get the answers they need out of a knowledge management system. This management system may take the form of a helper website or even a live chat with a bot. When self-service fails, customers prefer a variety of ways to communicate: through email, phone, instant messaging, or even text message.

By providing these additional resources for customers, organizations not only assist the customer in getting what they want, but also reduce their own administrative overhead. The more customer service can be automated, the less time and money the organization has to sink into technical support and customer service personnel.

It’s an exciting time for organizations looking to improve upon their customer experience. Through better customer experience management, companies can fine-tune their operations and ensure that their customers keep coming back.

Find Meaning in Your Data With Elasticsearch

Posted by on March 28, 2017

We’re surrounded by data everywhere we go, and the amount is growing with each action we take. We rely on it regularly, probably a lot more than we even realize or would like to admit. From searching for nearby restaurants while traveling, to reading online product reviews prior to making a purchase, and finding the best route home based on real-time traffic patterns, data helps us make informed decisions every day.

However, all that data on its own is just data. The real value comes when you can find the right, relevant data when it’s needed. Better yet, take it a step further and find meaning in your data, and that’s where the real goldmine is.

Businesses are increasingly turning to search and analytics solutions to derive more value from their data, helping to provide the deep insights necessary to make better business decisions. Some popular use cases include:

  • Intelligent Search Experiences to discover and deliver relevant content
  • Security Analytics to better understand your infrastructure’s security
  • Data Visualization to present your data in a meaningful way
  • Log Analytics to gain deeper operational insight

At Rivet Logic, we realize the importance of data, and see the challenges businesses are facing in trying to make sense of their growing data pools. We’re excited to have partnered with Elastic – the company behind a suite of popular open source projects including Elasticsearch, Kibana, Beats, and Logstash – to deliver intelligent search and analytics solutions to help our customers get the most value out of their data, allowing them to make actionable improvements to websites for enhanced customer experiences!

A Real-world Use Case

How might this apply in a real-world scenario, you ask?

One example is a global hospitality customer of ours, who partnered with Rivet Logic to implement three internal-facing web properties that enable the company to perform its day-to-day business operations. With a reach spanning 110+ countries, these sites are deployed in the cloud on Amazon AWS across the US, Europe and Asia Pacific, draw on many data sources, and are used across multiple devices.

This customer needed a way to gain deeper insight into these systems — how the sites are being used along with the types of issues encountered — to help improve operational efficiencies. Using Elasticsearch and Kibana, this customer is able to gain much better visibility into each site’s utility. Through detailed metrics, combined with the ability to perform aggregations and more intelligent queries, this customer can now gain much deeper insight into their data set through in-depth reports and dashboards. In addition, the Elastic Stack solution aggregates all system logs into one place, making it possible to perform complex analysis to provide insightful data to better address operational concerns.
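The kind of aggregation query such a dashboard runs might look like the sketch below: count log events per site, bucketed by error level, over the last week. The index and field names here are illustrative assumptions, not the customer's actual mapping:

```python
# Sketch of an Elasticsearch aggregation: log events per site, bucketed by
# error level. Index and field names are illustrative assumptions.

query = {
    "size": 0,  # we only want aggregation buckets, not raw hits
    "query": {"range": {"@timestamp": {"gte": "now-7d"}}},
    "aggs": {
        "per_site": {
            "terms": {"field": "site.keyword"},
            "aggs": {
                "by_level": {"terms": {"field": "level.keyword"}},
            },
        }
    },
}

# With the official Python client, this would be submitted roughly as:
#   from elasticsearch import Elasticsearch
#   es = Elasticsearch()
#   response = es.search(index="app-logs-*", body=query)

print(query["aggs"]["per_site"]["terms"]["field"])
```

Kibana builds visualizations over exactly this kind of bucketed result, which is what turns raw logs into the operational dashboards described above.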


NoSQL Design Considerations and Lessons Learned

Posted by on July 29, 2015

At Rivet Logic, we’ve always been big believers and adopters of NoSQL database technologies such as MongoDB. Now, leading organizations worldwide are using these technologies to create data-driven solutions that help them gain valuable insight into their business and customers. However, selecting a new technology can turn into an over-engineered process of check boxes and tradeoffs. In a recent webinar, we shared our experiences, thought processes and lessons learned building apps on NoSQL databases.

The Database Debate

The database debate is never ending, where each type of database has its own pros and cons. Amongst the multitude of databases, some of the top technologies we’ve seen out in the market include:

  1. MongoDB – Document database
  2. Neo4j – Graph database
  3. Riak – Key value data store
  4. Cassandra – Wide column database

Thinking Non-Relational

When it comes to NoSQL databases, it’s important to think non-relational. With NoSQL databases, there’s no SQL query language and there are no joins. NoSQL also isn’t a drop-in replacement for relational databases, as the two are completely different approaches to storing and accessing data.

Another key consideration is normalized vs. denormalized data. Whereas data is normalized in relational databases, normalization isn’t a necessity or an important design consideration for NoSQL databases. In addition, you can’t use the same tools, although that’s improving as technology companies invest heavily in making their tools integrate with various database technologies. Lastly, you need to understand your data access patterns, and what they look like from the application level down to the DB.
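The normalized vs. denormalized distinction can be illustrated with plain data structures: the same order stored relational-style (joined at read time) versus as one document shaped for the access pattern. Field names are illustrative:

```python
# Sketch: the same order, normalized vs. denormalized. Names illustrative.

# Normalized: two "tables", joined by customer_id when the app reads an order.
customers = {1: {"name": "Ana", "city": "Reston"}}
orders = [{"order_id": 100, "customer_id": 1, "total": 42.50}]

def read_order_normalized(order):
    customer = customers[order["customer_id"]]  # the "join"
    return {**order, "customer_name": customer["name"]}

# Denormalized: the customer fields the app needs are copied into the order
# document itself, so one read answers the query with no join.
order_doc = {
    "order_id": 100,
    "total": 42.50,
    "customer": {"name": "Ana", "city": "Reston"},
}

print(read_order_normalized(orders[0])["customer_name"])
print(order_doc["customer"]["name"])  # same answer, one read
```

The tradeoff is duplication: the denormalized document answers its query in one read, but updating the customer's name now means touching every order document that embeds it. That is why understanding your access patterns comes first.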

Expectations

Also keep in mind your expectations and make sure they’re realistic. Whereas the Relational model is over 30 years old, the NoSQL model is much younger at approximately 7 years, with enterprise adoption occurring within the last 5 years. Given the differences in age, NoSQL tools aren’t going to have the same level of maturity as those of Relational DB’s.

When evaluating new DB technologies, you need to understand the tradeoffs and what you’re willing to give up – whether it be data consistency, availability, or other features core to the DB – and determine if the benefits outweigh the tradeoffs. And these DB’s aren’t all created equal – they’re built on different models for data storage and access and use different languages – all of which require ramp-up.

In addition, keep in mind that scale and speed are all relative to your needs. Understanding all of these factors in the front end will help you make the right decision for the near and long term.

Questions to Ask Yourself

If you’re trying to determine if NoSQL would be a good fit for a new application you’re designing, here are some questions to ask yourself:

  1. Will the requirements evolve? Most likely they will, rarely are all requirements provided upfront.
  2. Do I understand the tradeoffs? Understand your must have vs. like to have.
  3. What are the expectations of the data and patterns? Read vs. write, and how you handle analytics (understand operational vs. analytics DB and where the overlap is)
  4. Build vs. Buy behavior? Understand what you’re working with internally and that changing internal culture is a process
  5. Is the ops team on board? When introducing new DB technologies, it’s much easier when the ops team is on board to make sure the tools are properly optimized.

Schema Design Tidbits

Schema is one of the most critical things to understand when designing applications for these new databases. Ultimately, your data access patterns should drive your design. We’ll use MongoDB and Cassandra as examples, as they’re leading NoSQL databases with different models.

When designing your schema for MongoDB, it’s important to balance your app needs, performance and data retrieval. Your schema doesn’t have to be defined day 1, which is a benefit of MongoDB’s flexible schema. MongoDB also contains collections, which are similar to tables in relational DB’s, where documents are stored. However, the collections don’t enforce structure. In addition, you have the option of embedding data within a document, which depending on your use case, could be highly recommended.
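Here is a minimal sketch of the embedding option just described: a blog post document that carries its comments inside it, so one document read returns the post and its whole discussion. The collection and field names are illustrative:

```python
# Sketch: an embedded-document design for a post and its comments.
# Collection and field names are illustrative.

post = {
    "title": "Thinking Non-Relational",
    "author": "rivet",
    "comments": [   # embedded array instead of a separate comments table
        {"user": "ana", "text": "Great overview"},
        {"user": "raj", "text": "What about transactions?"},
    ],
}

# With pymongo this document would be inserted roughly as:
#   from pymongo import MongoClient
#   db = MongoClient().blog
#   db.posts.insert_one(post)   # collections enforce no structure

# Adding a comment is an append into the embedded array, not an insert
# into another table — and new fields can be added to any document later.
post["comments"].append({"user": "li", "text": "Very helpful"})
print(len(post["comments"]))
```

Embedding works well when the child data is read with its parent and stays bounded in size; unbounded or independently queried children are usually better referenced from a separate collection.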

Another technology to think about is Cassandra, a wide column database where you model around your queries. By understanding the access patterns and the types of questions your users are asking the DB, you can design a more accurate schema. You also want to distribute data evenly across nodes. Lastly, you want to minimize reads across partitions (groups of rows that share the same key).
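A small sketch of that "model around your queries" idea: if the query is "events for a site on a given day", making (site, day) the partition key both spreads data across nodes and keeps any one partition from growing without bound. The event shape here is a made-up example:

```python
# Sketch: choosing a Cassandra-style partition key around the query.
# The (site, day) compound key is illustrative.
from collections import defaultdict

def partition_key(event):
    # The day bucket keeps a busy site from growing one unbounded partition.
    return (event["site"], event["timestamp"][:10])

partitions = defaultdict(list)
events = [
    {"site": "us", "timestamp": "2015-07-29T10:00", "msg": "login"},
    {"site": "us", "timestamp": "2015-07-29T11:00", "msg": "error"},
    {"site": "us", "timestamp": "2015-07-30T09:00", "msg": "login"},
]
for e in events:
    partitions[partition_key(e)].append(e)

# All rows answering "us on 2015-07-29" live in one partition; the next
# day's rows are a separate partition on a possibly different node.
print(sorted(partitions.keys()))
```

In CQL terms this corresponds to declaring the compound partition key in the table definition; the point is that the key is derived from the question the application asks, not from the entity model.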

Architecture Examples

MongoDB has a primary-secondary architecture, where a secondary is promoted to primary if the primary ever fails, resulting in the notion of never having the DB offline. Writes, consistency, and durability are handled through primaries replicating to the secondaries. So in this model the database is always available, with data consistent and replicated across nodes, all performed in the background by MongoDB. In terms of scalability, you scale horizontally, adding nodes as you go, which introduces a new concept of sharding – how data dynamically scales as the app grows.

On the other hand, Cassandra has a ring-based architecture, where data is distributed across nodes, similar to MongoDB’s sharding. The patterns are similar, but implemented differently in each technology. The diagram below illustrates architectural examples of MongoDB and Cassandra. Both can be distributed globally with dynamic scalability, the benefit being that you can add nodes effortlessly as you grow.

NoSQL Data Solution Examples

Some of the NoSQL solutions we’ve recently built include:

Data Hub (aka 360 view, omni-channel) – A collection of various data sources pooled into a central location (in this case we used MongoDB), with use cases built around the data. This enables new business units to access data they might not previously have had access to, empowering them to build new products, understand how other teams operate, and ultimately create new revenue-generating opportunities and improved processes across the organization.

User Generated Content (UGC) & Analytics – Storing UGC sessions (e.g. blog comments and shares) that need to be stored and analyzed in the backend. The Document model often makes sense for this type of solution. However, as technologists continue to build their NoSQL skill sets, there’s going to be an increasing overlap of similar use cases being built across various NoSQL DB types.

User Data Management – Also known as Profile Management, and storing information about the user, what they recently viewed, products bought, etc. With a Document model, the flexibility really becomes powerful to evolve the application as you can add attributes as you go, without the need to have all requirements defined out of the gate.

Lessons Learned

When talking about successful deployments, some of the lessons learned we’ve noticed include:

  1. Schema design is an ongoing process – From a Data Hub perspective, defining that “golden record” is not always necessary, as long as you define consistent fields that can be applied everywhere.
  2. Optimization is a team effort – It’s not just the developer’s job to optimize the schema, just like it’s not just the Ops team’s job to make sure the DB is always on. NoSQL gives you tunability across both, delivering the best performance and results.
  3. Test your shard keys (MongoDB) – If sharding is a new concept for you, make sure you do your homework, understand and validate with someone that knows the DB very well.
  4. Don’t skimp on testing and use production data – Don’t always assume that the outcome is going to be the same in production.
  5. Shared resources will impact performance – Keep in mind if you’re deploying in the cloud that shared resources will impact distributed systems. This is where working with your Ops team will really help and eliminate frustrations.
  6. Understand what tools are available and where they are in maturity – Don’t assume existing tools (reporting, security, monitoring, etc.) will work in the same capacity as with Relational DB’s, and understand the maturity of the integration.
  7. Don’t get lost in the hype – Do your homework.
  8. Enable the “data consumer” – Enable the person that’s going to interact with the DB (e.g. data analyst) to make them comfortable working with the data.
  9. JSON is beautiful
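Lesson 3 ("test your shard keys") lends itself to a quick sanity check: before committing to a candidate shard key, count documents per key value and look for hot spots. A key where one value holds most of the documents will concentrate load on one shard. The document shape below is illustrative:

```python
# Sketch: measuring skew for a candidate shard key. A skew of 1.0 means
# every document shares one key value - the worst case for sharding.
from collections import Counter

def shard_key_skew(docs, key):
    """Fraction of documents held by the most common value of `key`."""
    counts = Counter(d[key] for d in docs)
    return max(counts.values()) / len(docs)

docs = [{"user_id": i % 4, "country": "US"} for i in range(100)]
print(shard_key_skew(docs, "user_id"))  # values spread evenly
print(shard_key_skew(docs, "country"))  # every doc shares one value
```

A check like this on a sample of production data (per lesson 4) catches low-cardinality or hot-spot keys before they become a resharding project.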

To summarize, education will eliminate hesitation, and don’t get lost in the marketing fluff. Get Ops involved, the earlier and more often you work with your Ops team, the easier and more successful your application and your experience with these technologies will be. Lastly, keep in mind that these are just DB tools, so you’ll still need to build a front end.

Click here to see a recording of the webinar.

Click here for the webinar slides.

Building Engaging Customer Experiences Powered by MongoDB

Posted by on July 08, 2014

This spring and summer, the MongoDB Road Show stops in over 20 cities across the country to educate users on how MongoDB can be used to build modern business apps to improve the customer experience and accelerate time to market. Rivet Logic sponsored several of the cities and presented on the topic of building engaging customer experiences with MongoDB, discussing how a modern database can be used to better leverage existing data to derive business value. The next MongoDB Road Show is this Thursday, July 10, in San Francisco!

What Organizations Need

Organizations seeking to build engaging customer experiences on the Web often have a similar set of goals. To start, they want to increase user adoption by providing an engaging experience that brings value to the end-user. This can lead to increased customer retention, allowing organizations to create loyal customers who can then become their own brand ambassadors.

Moreover, organizations want to capitalize on their customers’ and users’ creativity and innovation by seamlessly weaving in the ability to collaborate, interact, and share into every aspect of the user experience. Businesses find the quality of this type of engagement to be particularly beneficial, due to its unpredictability. However, to enhance the value of these interactions, users need a motivator, meaning organizations need to create high quality content that’s personalized and targeted to each user’s needs.

While personas are great and have worked well to capture general types of users, in reality, users think of themselves more as individuals, with evolving interests over time. Organizations are now faced with delivering personalized experiences beyond a persona level and at an individual level.

What’s the Problem?

However, many organizations are having a hard time with this fine-grained personalization, and it’s largely due to the limiting technology they’re working with. IT teams are often faced with seemingly “impractical” features that business teams are requesting.

Organizations today are using separate systems – standalone content apps (blogs, forums, wikis), commenting engines, traditional databases, and BI tools – to enable user interaction and to collect and analyze information about users. The quality of user interactions is largely driven by the quality of the user-generated content being collected and analyzed. However, since much of this valuable customer data is siloed in disparate systems, businesses can’t effectively leverage their existing data.

While many have attempted to find workarounds, there hasn’t been any real success in creating a coherent, rich user-interaction data set that brings value to all the available delivery use cases. For example, when a user joins the comment thread of a blog entry, they are unaware of a possible forum thread discussing the same topic. In addition, these solutions are typically backed by traditional databases, which require changing the infrastructure to accommodate new use cases, posing a challenge.

The fact is, the various types of interactions that exist today are disjointed, resulting in redundancy and little chance of connecting and leveraging them. It’s critical that we make these interactions context-aware, and the only way of effectively doing so is to have a holistic view of all the user-generated content that’s being collected, while also allowing the interactions to cross application boundaries.

Pillars of a Good Solution

Successful solutions that meet these challenges must adhere to the following pillars:

Flexibility – The solution must be implemented using technology agnostic building blocks. Being a certain type of shop (.Net, PHP, Windows, etc.) constrains organizations from using the right tools for the job. Using technology agnostic building blocks as the underlying infrastructure allows organizations to innovate and improve their business without being held back by technology.

Scalability – The solution must be scalable without sacrificing performance. There are many platforms out there that claim to be scalable, but what good is that when scaling means long page load times?

Visibility – It’s also extremely important to know and see the overall picture and have a holistic view of user interactions – one that isn’t so low-level that it prevents you from seeing what users are actually doing (as is the case with auditing services).

Insight – Lastly, when rich, contextual data is available in one place, organizations need to be able to leverage that information to innovate and provide new features, capabilities, and value to their customers.

Case Study – AT&T Developer Community

Now let’s take a look at how a solution like this might be used in the real world. AT&T is currently undergoing an initiative to build a solution to enhance the user experience of their developer community site. The existing site’s collaboration tools are traditional in nature (i.e. blogs and forums), where user engagement is fragmented, making it difficult to find interesting content and reducing collaboration value.

To resolve this, Rivet Logic is implementing a solution that enables user-generated content to cross application boundaries and reside in one location via Crafter Social, while also allowing for better personalization by using Crafter Profile to maintain a dynamic customer profile.

Crafter Social easily adds social engagement features – user-generated comments, likes, ratings, blogs, discussion forums, and more – to a website by attaching social features to any content item or page. And Crafter Profile provides user profile and account management to help create personalized experiences.

For example, in the current site, if a user comments on a blog entry and another user participates in a forum discussion about the same topic, these interactions are not associated in any way.

With the Crafter Social solution, we were able to turn the blog entry’s comment thread into a virtual forum, thus connecting the two threads of discussion into one. This simple approach is extremely powerful, satisfying all four pillars of a good solution focused on enhancing customer engagement.
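The idea behind the virtual forum can be sketched with plain data structures: both the blog entry and the forum topic point at one shared discussion thread, so comments made in either place land in the same conversation. This structure is illustrative, not Crafter Social's actual schema:

```python
# Sketch: a shared thread referenced by both a blog entry and a forum topic.
# Structure is illustrative, not Crafter Social's actual data model.

thread = {"thread_id": "t1", "topic": "schema-design", "comments": []}
blog_entry = {"title": "Schema Design", "thread_id": "t1"}
forum_topic = {"title": "Schema questions", "thread_id": "t1"}

def add_comment(thread, source, user, text):
    """Append a comment, recording which surface it came from."""
    thread["comments"].append({"source": source, "user": user, "text": text})

add_comment(thread, "blog", "ana", "Embed or reference?")
add_comment(thread, "forum", "raj", "Depends on access patterns")

# Both surfaces render the same merged discussion.
print([c["source"] for c in thread["comments"]])
```

Because the comments are one document set rather than two, a flexible document store can serve both surfaces (and any future ones) from a single source of truth.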

Even more, due to Crafter Social’s flexible architecture and underlying data model, it can easily be extended into other use cases, made possible by MongoDB’s document-based data models. In addition, the ability to easily embed Crafter Social into any site using any technology makes it an ideal part of any developer’s toolkit.

As illustrated in the diagram below, Crafter Social is broken into two parts. On the client side, it can be embedded on any site page regardless of what technology was used for implementation. And on the server side, Crafter Social collects various data from different sites and use cases, maintaining a holistic view of the user data. All of this helps enhance the quality of business intelligence information generated.

With this solution, AT&T is able to achieve their goals of increased user adoption and enhanced user engagement and retention. MongoDB plays a key role in the solution’s success by enabling:

  • Flexibility – Create new apps without revisiting infrastructure
  • Scalability – Ability to store large amounts of data and query without hurting performance
  • Visibility – Data is structured in an intuitive way allowing easy translation from raw data to something actionable
  • Insight – Flexible data structures and queries pave the way for creativity and innovation

To download a copy of the slides, click here.

What’s in Store for Digital Experience Management in 2014

Posted by on January 13, 2014

2014 is here in full swing, and promises to be an exciting year as the web continues to evolve and new products and trends continue to disrupt the industry.

In 2013, we saw the continued rise of mobile and the age of the customer, where enterprises worldwide re-evaluated strategies to optimally engage with their customers in this digital era. As mobile devices have proliferated and become the new norm, consumers increasingly expect the right content delivered to them when and how they want it, and customer experience management has skyrocketed to the top of every organization’s priority list.

However, experience management doesn’t just apply to customers, but instead extends to include all organizational stakeholders – customers, employees, partners, etc. We also saw organizations take a closer look internally and focus their efforts on employee community building. Realizing that workplace environments are changing, with a greater need for enterprise collaboration now than ever before, companies are implementing social intranet solutions that offer dynamic and social environments to facilitate community and collaboration.

In addition, many organizations are also building web-based social communities for their external stakeholders for further engagement to improve customer relations and build brand loyalty.

Tackling these daunting digital experience management tasks requires careful planning and execution. Organizations need to first determine their business strategies and goals and take the time to really understand their audience to formulate the right messaging. A well thought out strategy sets the right foundation to build your systems – customer experience management, social intranet, customer portal, etc. – upon. The technology should be an enabler of your goals and should help your business users effectively carry out your business objectives.

At Rivet Logic, we believe software should consist of agile systems that can easily be customized to fit each organization’s unique needs. There’s no one-size-fits-all tool, and your underlying system must be flexible and developer friendly to allow customizations and integrations with other existing enterprise applications. In addition, your system must be friendly to business users. As we’ve seen over the past few years, there’s been a shift from IT to Marketing as Marketing’s responsibilities have expanded to include multi-channel web content management, customer experience management, and more. The tools we employ must be easy to use for non-technical business users.

In 2014 we’ll continue to see these trends evolve. Organizations will continue to put a large emphasis on customer experience management and creating a seamless omni-channel experience as mobile continues to grow. Businesses will also focus more on big data. The explosive growth of social media and mobile devices has generated an enormous amount of user behavioral data that can be harnessed to provide organizations with valuable insight on how to better address the needs of their customers and employees.