Big Data Era: Five Business Analytics Technology Trends

Today, companies are as focused on coping with analytics challenges as they are on exploiting the opportunities those challenges open up for new business insight. For example, as more and more companies begin to handle large amounts of data and consider how to use it, technologies are emerging to manage and analyze large, disparate data sets. Improving cost and performance trends in analytics mean companies can ask more complex questions than before, yielding more useful information to help them run their businesses.

In interviews, CIOs identified five major IT trends affecting their analytics work: the growth of big data, faster processing technology, the falling cost of IT commodities, the popularity of mobile devices, and the growth of social media.

1. Big data

Big data refers to very large data sets, especially those that are not neatly organized and do not fit into traditional data warehouses. Web crawler data, social media feeds, and server logs, as well as data from supply chains, industry, the surrounding environment, and surveillance sensors, are all making companies' data more complex than ever.

Not every company needs technology for handling large, unstructured data sets. Still, Perry Rotella, chief information officer of Verisk Analytics, believes that all CIOs should be watching big data analysis tools. Verisk helps financial companies assess risk and works with insurance companies to prevent insurance fraud; its revenue exceeded $1 billion in 2010.

Rotella believes technology leaders should adopt the attitude that more data is better, and should welcome its substantial growth. Rotella's job is to spot connections and patterns among things ahead of time.

Cynthia Nustad, chief information officer of HMS, says big data is growing "explosively." HMS's business includes helping to control Medicare and Medicaid program costs as well as private cloud services; its clients include more than 40 state health and human services programs and more than 130 Medicaid managed care plans. By catching erroneous payments, HMS helped its customers recover $1.8 billion in 2010 and saved billions more through prevention. Nustad said: "We are collecting and tracking a lot of material, both structured and unstructured data, because you don't always know what you will find in it."

One of the most talked-about big data technologies is Hadoop, an open source distributed data processing platform originally developed for tasks such as compiling web search indexes. Hadoop is one of several "NoSQL" (non-relational) technologies, including CouchDB and MongoDB, that organize web-scale data in unconventional ways.

Hadoop farms subsets of a data set out to hundreds or thousands of servers for processing, and a master job scheduler assembles the results each server reports, which is how it can process petabytes of data. Hadoop can be used both to prepare data for analysis and as an analysis tool in its own right. Companies without thousands of spare servers can buy on-demand access to Hadoop instances from cloud vendors such as Amazon.
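As a rough illustration of that split-process-merge pattern, here is a minimal sketch in Python (not Hadoop itself) that fans a data set out to worker processes and merges their partial results; the log lines and the term-counting job are invented for the example.

```python
from collections import Counter
from multiprocessing import Pool

def count_terms(chunk):
    """Worker step: each "server" tallies terms in its subset of the data."""
    counter = Counter()
    for record in chunk:
        counter.update(record.split())
    return counter

def run_job(records, workers=4):
    """Scheduler step: split the data, farm chunks out, merge partial results."""
    size = max(1, len(records) // workers)
    chunks = [records[i:i + size] for i in range(0, len(records), size)]
    with Pool(workers) as pool:
        partials = pool.map(count_terms, chunks)
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    logs = ["error disk full", "login ok", "error timeout", "login ok"]
    print(run_job(logs).most_common(3))
```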

HMS is exploring NoSQL technology, although not for its large Medicare and Medicaid claims databases, Nustad said. Those hold structured data that traditional data warehouse technology handles well, and it would be unwise to walk away from relational database management for questions relational technology has proven to answer best. However, Nustad believes Hadoop has an important role to play in fraud and waste prevention analysis, and the potential to analyze patient records that are reported in a variety of formats.

Among the CIOs interviewed, those with hands-on Hadoop experience, including Rotella and Shopzilla CIO Jody Mulkey, all work at companies that offer data services as part of their business.

Mulkey said: "We are using Hadoop for things we used to do with data warehouses, and more importantly we have gained practical, useful analysis techniques we never had before." As a comparison-shopping site, Shopzilla accumulates terabytes of data every day. "Before, we had to sample the data and classify the data; that workload was very heavy when dealing with massive data," he said. Since adopting Hadoop, Shopzilla can analyze the raw data and skip many intermediate steps.

Good Samaritan Hospital, a community hospital in southwestern Indiana, is in a different category. "We don't have what I would consider big data," said Chuck Christian, the hospital's chief information officer. Still, regulatory requirements are forcing it to store entirely new kinds of data, such as massive volumes of electronic medical records. That, he said, will undoubtedly demand the ability to glean health care quality information from the data, but this is more likely to happen within regional or national health care associations than within individual hospitals like his. So Christian may never need to invest in the new technology himself.

John Ternent, chief information officer of Island One Resorts, said the analytics challenges a company faces depend on whether the emphasis falls on the "big" or the "data" in big data. For now, though, he is cautiously considering cloud-hosted Hadoop instances as an economical way to analyze complex mortgage portfolios. The company currently manages eight timeshare resorts in Florida. "This solution has the potential to solve real problems we are encountering right now," he said.

2. Accelerating business analysis

Vince Kellen, chief information officer of the University of Kentucky, sees big data technology as just one element of a larger trend toward faster analysis. "What we're really after is a more advanced approach to analyzing massive amounts of data," he said. How big the data is matters less than how quickly it can be analyzed, "because you want the process done fast."

Because today's computers can process more data in memory, they can calculate results far faster than when the data must be hunted down on a hard drive. This holds even if you are only dealing with a few gigabytes of data.

Caching frequently accessed data has been used to improve database performance for decades. The technique becomes far more useful when an entire large data set can be loaded into the memory of a server or server cluster, with the disk relegated to backup duty. Because retrieving data from a spinning disk is a mechanical process, it is much slower than processing the same data in memory.
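A small sketch of the idea, using Python's built-in sqlite3 module: the file, table, and column names are invented, but the pattern of loading a data set into memory once and then running repeated queries against RAM is the one the paragraph describes.

```python
import sqlite3
import time

# Build a small on-disk table as a stand-in for a large data set
# (file, table, and column names are hypothetical).
disk = sqlite3.connect("claims.db")
disk.execute("CREATE TABLE IF NOT EXISTS claims (amount REAL)")
disk.executemany("INSERT INTO claims VALUES (?)",
                 [(i * 0.5,) for i in range(100_000)])
disk.commit()

# One-time load of the entire data set into memory; afterwards the
# disk serves only as a backup, as described above.
mem = sqlite3.connect(":memory:")
disk.backup(mem)

start = time.perf_counter()
for _ in range(100):  # repeated query/refine cycles hit RAM, not the spindle
    mem.execute("SELECT AVG(amount) FROM claims").fetchone()
print(f"100 in-memory aggregates took {time.perf_counter() - start:.3f}s")
```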

Rotella said analysis he now completes in seconds would have taken an evening five years ago. His company focuses on forward-looking analysis of large data sets, which typically means running a query, finding patterns, making adjustments, and querying again, so query completion time matters greatly to the speed of analysis. "In the past, the run times were longer than the modeling times; now the modeling time is longer than the run time," he said.

Columnar database servers turn the traditional row-and-column organization of the relational database on its side to meet another set of performance requirements. A query touches only the columns it needs, rather than reading whole records and extracting selected columns, which greatly improves performance for applications that aggregate or measure a few key columns.

Ternent cautioned that the performance advantages of columnar databases require proper application and query design. "To make a difference, you have to ask the right questions in the right way," he said. He also noted that columnar databases really only make sense for applications handling upwards of 500 gigabytes of data: "You have to reach a certain scale before a columnar database can work, because it relies on a certain level of repetition to gain efficiency."
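The following sketch, with invented records, contrasts the two layouts and shows why repetition in a column matters: a repetitive column compresses well under run-length encoding, one of the techniques columnar stores lean on. It illustrates the general idea, not any particular product.

```python
# Row-oriented storage: each record is kept whole, so a query on one
# field still drags every field of every record off the disk.
rows = [
    ("2011-06-01", "FL", 120.0),
    ("2011-06-01", "FL", 80.0),
    ("2011-06-02", "GA", 95.0),
]

# Column-oriented storage: each field is kept contiguously, so a query
# reads only the columns it actually needs.
columns = {
    "date":   [r[0] for r in rows],
    "state":  [r[1] for r in rows],
    "amount": [r[2] for r in rows],
}
total = sum(columns["amount"])  # touches one column, not whole records

def run_length_encode(values):
    """Collapse consecutive repeats into (value, count) pairs."""
    encoded, prev, count = [], None, 0
    for v in values:
        if v == prev:
            count += 1
        else:
            if prev is not None:
                encoded.append((prev, count))
            prev, count = v, 1
    if prev is not None:
        encoded.append((prev, count))
    return encoded

print(total)                                # 295.0
print(run_length_encode(columns["state"]))  # [('FL', 2), ('GA', 1)]
```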

Allan Hackney, chief information officer of insurance and financial services giant John Hancock, said that boosting analysis performance also calls for hardware improvements, such as adding GPU chips, the same graphics processors used in gaming systems. "The computational methods required for visualization are very similar to those used in statistical analysis," he said. "Graphics processors can perform calculations hundreds of times faster than ordinary PC and server processors. Our analysts love this equipment."
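As a hedged illustration of moving a statistical calculation onto a GPU, the sketch below uses CuPy, an open source NumPy-compatible GPU array library; CuPy and a CUDA-capable device are assumptions for the example, not a description of John Hancock's actual setup.

```python
import numpy as np

# CPU version of a simple statistical kernel.
data = np.random.rand(10_000_000).astype(np.float32)
print("CPU:", data.mean(), data.std())

# The same computation on a GPU, assuming the optional CuPy library
# and a CUDA-capable device are available.
try:
    import cupy as cp
    gpu_data = cp.asarray(data)  # copy the array into GPU memory
    print("GPU:", float(cp.mean(gpu_data)), float(cp.std(gpu_data)))
except ImportError:
    print("CuPy not installed; GPU step skipped.")
```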

3. Technology costs are falling

As computing power grows, analytics technology is also benefiting from falling memory and storage prices. At the same time, as open source software matures into a credible alternative to commercial products, competitive pressure is pushing commercial prices down further.

Ternent is a proponent of open source software. Before joining Island One, he was vice president of engineering at Pentaho, an open source business intelligence company. In his view, open source levels the playing field, because a midsize company like Island One can use the open source application R in place of SAS for statistical analysis.

Open source tools used to offer only basic reporting capabilities, but now they provide the most advanced predictive analytics. "Open source players now span the entire continuum, which means anyone can use them," he said.
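The article names R; as a neighboring illustration of the same point, free tooling handling statistical work that once required licensed packages, here is a linear regression using the open source SciPy library for Python. The spend and bookings figures are invented.

```python
from scipy import stats  # open source; no license fees

# Hypothetical data: marketing spend (in $000s) vs. timeshare bookings.
spend    = [1.0, 2.0, 3.0, 4.0, 5.0]
bookings = [12,  19,  29,  37,  45]

result = stats.linregress(spend, bookings)
print(f"slope={result.slope:.2f}, intercept={result.intercept:.2f}, "
      f"r^2={result.rvalue ** 2:.3f}")
```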

HMS's Nustad believes changing computing costs are reshaping some infrastructure choices. Traditionally, one rationale for building a data warehouse was to pull the data together on servers with enough computing power to process it; keeping analytics workloads off the operational systems avoided degrading day-to-day performance when computing power was scarce. That is no longer the only sensible choice, Nustad said: "As hardware and storage get cheaper, you can have these operational systems handle a business intelligence layer." Analytics built directly on operational applications can deliver answers faster than reformatting the data and loading it into a warehouse.

Hackney observes that while price/performance trends favor lower costs, the potential savings are offset by growing capacity requirements: John Hancock's per-device storage costs fell 2 to 3 percent this year, while consumption grew 20 percent.

4. The popularity of mobile devices

Like all applications, business intelligence is increasingly going mobile. For Nustad, mobile business intelligence is a priority because she wants access, anytime and anywhere, to reports on whether her company is meeting its service level agreements. She also wants to give the company's customers mobile access to data to help them monitor and manage their health care expenses. "This is a feature customers really like," she said. "Five years ago, customers weren't asking for it; now they are."

For CIOs, catering to this trend has more to do with creating user interfaces that work on smartphones, tablets, and touch-screen devices than with more sophisticated analytical capabilities. Perhaps for that reason, Kellen considers it relatively easy: "For me, it's just a small thing."

Rotella does not find it so simple. "Mobile computing affects everyone," he said. "Many people are using iPads for work, and other mobile devices are exploding. This trend is accelerating and changing how we interact with the computing resources inside our companies." For example, Verisk has developed a product that lets claims adjusters run analyses in the field, so they can perform replacement-cost assessments on the spot. "It has made a difference in our analysis and puts it within easy reach of everyone who needs it," he said.

The factor driving this challenge, Rotella said, is the pace of technology change: "Two years ago we didn't have iPads; now many people are using them. As multiple operating systems emerge, we're trying to understand how they affect our development so we don't have to write the same applications over and over." On the other hand, the need to build native apps for every mobile platform may be receding as the browsers on phones and tablets grow more capable, Island One's Ternent noted. "If I can serve mobile devices with a web application designed for them, I'm not sure I'd invest in a custom mobile application," he said.

5. The growth of social media

With the rise of social media such as Facebook and Twitter, more and more companies want to analyze the data these sites generate. Newly launched analytics applications support statistical techniques such as natural language processing, sentiment analysis, and network analysis that are not part of the typical business intelligence tool kit.

Because they are new, many social media analytics tools are offered as services. One prominent example is Radian6, a software-as-a-service (SaaS) product recently acquired by Salesforce.com. Radian6 presents a social media dashboard of comments on Twitter, posts on Facebook, and posts and comments on blogs and discussion boards, displaying positive and negative mention counts for specific terms, such as brand names, in vivid visualizations. When marketing and customer service departments buy such tools, they barely depend on the IT department at all. Kellen of the University of Kentucky still believes he needs to watch them closely: "My job is to identify these technologies, evaluate which algorithms suit the company competitively, and then start training the right people."
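To make the positive/negative counting concrete, here is a deliberately naive sketch of brand-mention scoring; the word lists and posts are invented, and real products such as Radian6 rely on far more sophisticated language processing than keyword matching.

```python
# Naive tally of positive and negative mentions of a brand, in the
# spirit of the dashboard counts described above.
POSITIVE = {"love", "great", "fast", "recommend"}
NEGATIVE = {"hate", "broken", "slow", "refund"}

def score_posts(posts, brand):
    """Return (positive, negative) keyword counts across posts naming the brand."""
    pos = neg = 0
    for post in posts:
        words = set(post.lower().split())
        if brand in words:
            pos += len(words & POSITIVE)
            neg += len(words & NEGATIVE)
    return pos, neg

posts = [
    "I love my acme laptop and its great battery",
    "acme support is slow and my screen is broken",
]
print(score_posts(posts, "acme"))  # (2, 2)
```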

Universities, like companies, are interested in monitoring their institutions' reputations. At the same time, Kellen said, he may look for opportunities to develop applications tailored to schools' particular concerns, such as monitoring student enrollment. Watching students' social media posts, for example, could help schools and administrators learn as early as possible about the troubles students run into at college. Kellen noted that Dell has already done this kind of work, with a product that lets the company detect people's tweets about faulty laptops. IT developers, he said, should also find ways to push alerts from social media analysis into applications so that companies can respond quickly to related events.
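A minimal sketch of that alert-pushing idea follows; the threshold and the notify() hook are invented, and in a real system the notification would go to a ticketing or messaging API rather than standard output.

```python
def notify(message):
    """Stand-in for pushing an alert into a downstream application."""
    print(f"ALERT: {message}")

def check_mentions(mentions, threshold=10):
    """mentions: iterable of (term, is_negative) pairs from an analysis feed."""
    negative_counts = {}
    for term, is_negative in mentions:
        if is_negative:
            negative_counts[term] = negative_counts.get(term, 0) + 1
    for term, count in negative_counts.items():
        if count >= threshold:
            notify(f"{count} negative mentions of '{term}'")

# Example: a burst of complaints about a product crosses the threshold.
check_mentions([("faulty laptop", True)] * 12)
```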

Hackney said: "We don't yet have the know-how or the tools to process and mine the value of massive social media posts. Once you have collected the data, you need enough information about company events to correlate the two." While Hackney calls John Hancock's efforts in this area "in their infancy," he believes IT departments will play an important role in correlating the data provided by social analytics services with the company's own data. For example, if social media data showed commentary about the company in the Midwest turning increasingly negative, he would want to see whether price or strategy changes in the region could reverse the trend. Accurate correlation, he said, is what convinces company leaders that investing in social media pays off: "In my industry, everyone is an actuary, everyone is calculating, and they don't take anything for granted."
