Our talented Senior Consultant, Brent Johnson, made the following interactive infographic. It chronicles the college football rivalry between his alma mater, The University of Miami, and Florida State. Any questions regarding this visualization can be sent directly to email@example.com
Article written by Jun Lee, Senior Consultant at Analytica Consulting
In 2017, data analytics took huge strides in innovation. Cloud analytics, in-memory computing, big data storage, enhanced data encryption – these tools are all available, should you decide to adopt them. Yet even with all of these tools, many companies continue to spend exorbitant amounts maintaining their traditional methods of collecting and analyzing data. What is the issue?
Priority. Analytics is pervasive across every operation in a business. No longer should analytics be handled solely by the business or data analysis teams. Instead, analytics should take a more active role in each division of a company. Every team – whether customer service, engineering, or business and finance – can use analytics to improve its performance, trace and diagnose errors, and design targeted solutions. Analytics is a priority at each level of the company and should be integrated into both distributed and local work.
Innovation. Having the latest technology and infrastructure is only a tool to reach your end goal: analyses that drive a business successfully. Adopting the proper analytics infrastructure is more than setting up the right technology; it is creating a learning culture that evolves its data sets and analytics applications. People who are data driven and embrace the many solutions business intelligence (BI) technology offers are a necessity for any company looking to create an effective analytics environment. The company needs the mindset to build a cohesive ecosystem, with exchanges of data and structured ideas that accumulate into shared knowledge. Only with this mindset can a company fully capture and realize the impact of analytics.
How do we get there?
The road to efficient analytics varies case by case. Here are a couple of ways to launch innovative analytics:
- Receive feedback from your teams and decide how to centralize the management of your data.
- Create granular levels of operation for analytics (trainings, consistent meetings and presentation sessions, organized request and task structures, scoping of high-level analytics projects).
- Present a model example of a data visualization and explain how the data leads to new business insights. Ideally, each team should receive a model data visualization showing how it can use analytics. Focusing on a concrete outcome can inspire others to try innovative methods and open minds to insightful analytics.
- Recognize the tools and expertise your company needs. Oftentimes, innovative analytics and ideas are stifled by the lack of proper tools for data insights: analytics projects get scoped around whatever traditional methods and applications are available to the team. However, many available BI tools spark innovation, improve the efficiency of the extract, transform, load (ETL) process, create appealing data visualizations, and reduce the time needed for both core and detailed analyses. Provide your company with a robust BI tool and the training to break past these limits.
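To make the ETL process mentioned above concrete, here is a minimal sketch in Python of the three stages a BI tool automates. The CSV layout and the `region`/`revenue` field names are illustrative assumptions, not from any specific tool:

```python
import csv

def extract(path):
    """Extract: read raw rows from a CSV export of a source system."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: normalize values and drop incomplete records."""
    clean = []
    for row in rows:
        if row.get("revenue"):  # skip rows with missing figures
            clean.append({"region": row["region"].strip().title(),
                          "revenue": float(row["revenue"])})
    return clean

def load(rows):
    """Load: aggregate into a summary table ready for visualization."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["revenue"]
    return totals
```

Even this toy pipeline shows where the time goes: cleaning and aggregation logic that a good BI tool handles with far less hand-written code.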
Let’s talk about the financial details:
| Initial Upfront Costs | Recurring Costs |
| --- | --- |
| Server setup | Security and data governance |
| BI tool purchase (Tableau, Qlik, Splunk, etc.) | BI tool subscriptions |
| Training sessions and development strategy | Analytics development |
The risks of staying behind your competition, losing potential insights on your data, and lacking efficiency in data analysis far outweigh the costs to correctly set up an effective analytics environment. If you’d like to take your analytics to the next level, please consider Analytica Consulting. Feel free to contact us at: 858-272-8260 or firstname.lastname@example.org.
Article written by Josh Karpen, Senior Consultant at Analytica Consulting
Data science, predictive analytics, machine learning – these methodologies are increasingly becoming a necessary part of the toolkit for modern organizations to compete effectively. Many companies are building data science teams to meet that challenge rather than hiring a single data scientist. One reason they opt for a team is that the mythical data scientist "unicorn" is simply too difficult to find, or too expensive when found. Another is that most data scientists cannot handle the workload alone as more departments start requesting their project time.
Data scientists may not be cheap to employ, but the software most of them use is usually free and open source. Free tools like R and Python, for example, are constantly updated, with new features added in the form of packages that extend their functionality. If an entire team is using these tools, however, rather than just one individual, there is a downside: the constant software updates and the flexibility of these tools make it difficult to keep projects organized across multiple team members.
When an organization finally puts together a data science team, it often discovers that the individuals on the team are speaking different languages. Then, when the team starts working on a project, there is no cohesion, which results in longer project completion times and scattered results.
Here are some other issues organizations can run into when a data science team uses a variety of tools rather than a single platform:
- Ensuring everyone is on the same version of their chosen software, and on the same version of every package. If one team member installs a new package, everyone else must install it as well to run that member's code. How do you enforce this, and who makes sure it happens? Done poorly, this becomes significant added work.
- Version control is not built into these tools. Data and code files often have to be sent around to the various team members, which can lead to data duplication, or to mistakes if the wrong file is used.
- Most tools do not provide a good way to keep the various runs of a model organized and properly labeled. A typical data scientist's project directory overflows with multiple copies of the same code, each slightly tweaked to try different parameters. Or the runs are stacked into a single code file that goes on forever, separated (if at all) by haphazard comment lines.
- When a new project is started, there is no easy way to search through past results to see what has been done before. This can lead to the same preliminary steps and data cleaning being repeated multiple times, wasting both time and money.
- Data scientists are not software engineers, so they may not be the best people to implement their models in an actual production environment. In a team setting, someone still has to bridge that gap.
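Two of the pain points above, scattered model runs and unsearchable past results, can be eased even without a full platform by logging every run with its parameters. A minimal sketch in Python; the `log_run`/`search_runs` helpers and the JSON-lines log file are illustrative conventions, not part of any particular tool:

```python
import json
import time

def log_run(log_path, model_name, params, metrics):
    """Append one model run (parameters + results) as a JSON line."""
    record = {
        "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
        "model": model_name,
        "params": params,
        "metrics": metrics,
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

def search_runs(log_path, model_name):
    """Return all logged runs for a model, e.g. before starting a new project."""
    with open(log_path) as f:
        runs = [json.loads(line) for line in f]
    return [r for r in runs if r["model"] == model_name]
```

A shared log like this is exactly the kind of bookkeeping a data science platform formalizes, with sorting, searching, and labeling built in rather than bolted on.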
The Benefits of a Data Science Platform for your Team
Some companies have chosen GitHub, which can help with some of these problems, but it was designed primarily for software engineers, not data scientists – and, to be honest, there is a learning curve to using Git properly.
However, there are other options and the relatively new concept of a data science platform can revolutionize the way your team works together. What should a useful data science platform do?
- Make it easy for the entire team to work on the same version of every piece of software and every package they use.
- Allow version control, so Team Member A can create a branch off of Team Member B’s model and test out some ideas without breaking the original version.
- Bring all of your data scientists' code into one searchable repository.
- Organize and label the results of every model in a manner that allows for sorting and searching.
- Make it easier to funnel the results of the optimal model into your real-world, production environment, whatever that might be.
- Create great documentation – the days of trudging through badly written explanations and walls of text should be far behind us.
- Generate a well-designed user interface that is easy to navigate through.
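The first capability above, keeping everyone on the same versions, can be approximated today with a simple environment check against a team-wide pin list. A sketch in Python (the `check_environment` helper and the pin dictionary are hypothetical, standing in for what a platform would manage automatically):

```python
import importlib.metadata

def check_environment(pinned):
    """Compare installed package versions against a team-wide pin list.

    Returns a dict of mismatches: package -> (wanted, installed),
    where installed is None if the package is missing entirely.
    """
    mismatches = {}
    for package, wanted in pinned.items():
        try:
            installed = importlib.metadata.version(package)
        except importlib.metadata.PackageNotFoundError:
            installed = None
        if installed != wanted:
            mismatches[package] = (wanted, installed)
    return mismatches
```

Run at the start of a session, a check like this catches the "it works on my machine" problem before a teammate's code fails halfway through a model run.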
There are a number of data science platforms on the market that have all or most of these capabilities. For example, yhat, Domino Data Labs and RapidMiner are a few of the strongest at the moment, but there are many others. All of them allow you to view a demo online of the tool in action, and most offer a free trial as well.
We will be reviewing some data science platforms over the next few months. Keep an eye out for our first review!
What is your opinion – are data science platforms a trend or a necessity? Feel free to share your comments on our LinkedIn page. Is there a particular platform that you are interested in and would like us to review? If so, email us at: email@example.com and we will consider your request.
Electronic Medical Record (EMR) systems have proven their worth within the healthcare system by improving patient care, reducing costs, and augmenting patient safety. These systems contain a wealth of information that is currently under-utilized and not analyzed effectively. With the correct analytics technology, a great deal of insight can be garnered from such data to help improve the operations of the healthcare provider.
Within the last several years, many EMR vendors have caught wind of this and have upgraded their core product with additional data analytic capabilities. In theory, this sounds like an excellent combination. However, in practice, these solutions present many challenges.
The following challenges are common to many built-in EMR analytics solutions:
- Numerous EMR vendors have chosen to build their own add-on analytics software rather than utilizing an established, commercially available platform. This proves a difficult undertaking for the vendor, steering them off course from their core offering and producing a rudimentary analytics solution that is subpar to those currently available.
- Data integration with external systems, such as financial and operational platforms, is complex and difficult to achieve with built-in EMR analytics. This prevents the healthcare provider from gaining insight across domains, eliminating the possibility of comprehensive reporting or analysis.
- Data visualization is often not user friendly and designed more for a data analyst rather than a healthcare provider or business user. Proper visualization of EMR data requires domain expertise in the healthcare space and a high degree of design knowledge to present data simply and effectively. EMR vendors may often have expertise in one, but not both, of these arenas.
- Healthcare systems often require customization of analytics software in order to meet specific requirements, whether to analyze their patient data or to achieve clinical metrics. Many current built-in EMR analytics systems cannot achieve this degree of customization, often leading to erroneous results that could impact the healthcare system's financial and patient care outcomes.
These challenges often force healthcare systems to develop their own analytics capabilities, a daunting feat which requires time and resources often not available. What is truly necessary is an analytics software provider that not only knows healthcare well, but is also able to utilize the best-in-breed analytics software. This, coupled with the domain expertise within the healthcare space, will lead to the creation of an integrated solution that meets the requirements of the healthcare provider.
“App,” a word previously recognized only by individuals with a computing background, became universally known after Apple released the App Store for the iPhone in 2008. Since then, more than 1 million apps have been created, with over 60 billion downloads. Google soon caught on to this wave and launched its own app store, as did Microsoft. Apple set a trend that defined how people interact with software on their mobile devices.
Analytics software on a mobile device should follow this same paradigm. If you want to analyze emergency wait time usage, readmission rates, or clinical coding, the phrase “there’s an app for that” should come to mind. The user experience should be comparable, if not superior, to that provided by other popular apps like Angry Birds, or Google Maps.
In a study by Flurry analyzing time spent on mobile devices in the US, the firm concluded that in 2014 consumers spent approximately 2 hours and 42 minutes on their devices each day. That time was split into two categories: mobile web and apps. Unsurprisingly, US consumers spent 86% of their time interacting with their mobile device via apps and only 14% via the mobile web browser.
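To put those percentages in concrete terms, the split works out to simple arithmetic on the figures above:

```python
# Back-of-envelope from the Flurry figures: 2 h 42 min per day, 86% in apps
total_minutes = 2 * 60 + 42                # 162 minutes per day
app_minutes = round(total_minutes * 0.86)  # about 139 minutes in apps
web_minutes = total_minutes - app_minutes  # about 23 minutes in the browser
```

Roughly two and a quarter hours a day in apps versus twenty-odd minutes in the browser.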
Based on this evidence, there are several key components to making a successful analytics application on a mobile device:
- Native gestures supported by the device such as swipe, pull down, pinch and expand should be utilized as much as possible when interacting with the visualizations
- The user interface should refactor itself automatically and take advantage of different screen sizes to offer the best user experience (tablet vs. smartphone vs. watch)
- Interactivity with the visualizations and data should be instantaneous. For example, filtering on a field such as year or region should have almost no delay
- To accommodate slow or spotty network connections, data should be cached on the device as much as possible, or the app should offer a full offline mode with later synchronization
- The app should utilize and integrate with an organization’s mobile device management (MDM) infrastructure
- Security features such as local data encryption on the device, expiration of content, application pin and others should be provided and made configurable to the organization’s standards
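The caching point above amounts to a read-through cache: serve fresh data when the network is up, fall back to the local copy when it is not. A minimal sketch of the idea in Python, where the `fetch` callable, the cache file layout, and the `OfflineCache` name are all illustrative assumptions rather than any vendor's API:

```python
import json
import os
import time

class OfflineCache:
    """Read-through cache: fresh data when online, cached data otherwise."""

    def __init__(self, path, fetch):
        self.path = path    # local cache file on the device
        self.fetch = fetch  # callable that pulls fresh data over the network

    def get(self):
        try:
            data = self.fetch()              # try the network first
            with open(self.path, "w") as f:  # synchronize the local copy
                json.dump({"cached_at": time.time(), "data": data}, f)
            return data
        except OSError:                      # slow or absent connection
            if os.path.exists(self.path):
                with open(self.path) as f:
                    return json.load(f)["data"]
            raise
```

The same pattern applies whatever the native stack: the user keeps seeing the last synchronized dashboard instead of a spinner when the connection drops.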
Analytics vendors who follow these principles will see much better user adoption than those who try to provide a "one-size-fits-all" offering (a single solution for desktops, tablets, and phones, typically through a web browser interface). The latter option is undoubtedly more attractive to the analytics vendor, given the ease of reuse, lower development costs, and lighter maintenance. In the long run, however, such an offering carries a much higher cost: low user adoption.