
Thursday, 2 February 2012

FBI plans social network map alert mash-up application

The FBI is seeking to develop an early-warning system based on material "scraped" from social networks.

It says the application should provide information about possible domestic and global threats superimposed onto maps "using mash-up technology".

The bureau has asked contractors to suggest possible solutions including the estimated cost.

Privacy campaigners say they are concerned that the move could have implications for free speech.

The FBI's Strategic Information and Operations Center (SIOC) posted its "Social Media Application" market research request on the web on 19 January, and it was subsequently flagged up by New Scientist magazine.

The document says: "Social media has become a primary source of intelligence because it has become the premier first response to key events and the primal alert to possible developing situations."

It says the application should collect "open source" information and have the ability to:

  • Provide an automated search and scrape capability of social networks including Facebook and Twitter.
  • Allow users to create new keyword searches.
  • Display different levels of threats as alerts on maps, possibly using colour coding to distinguish priority. Google Maps 3D and Yahoo Maps are listed among the "preferred" mapping options.
  • Plot a wide range of domestic and global terror data.
  • Immediately translate foreign language tweets into English.
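The requirements above amount to keyword matching plus colour-coded alerting. Purely as an illustration (the bureau's actual categories, keywords and colour scheme are not public, so everything below is invented), a minimal keyword-alert sketch might look like:

```python
import re

# Hypothetical watchlist mapping keywords to alert colours; the real
# system's terms and priority scheme are not disclosed in the request.
WATCHLIST = {
    "bomb threat": "red",
    "protest": "yellow",
    "flood": "orange",
}

def scan_post(text):
    """Return (keyword, colour) alerts for every watchlist term found in a post."""
    alerts = []
    for keyword, colour in WATCHLIST.items():
        if re.search(r"\b" + re.escape(keyword) + r"\b", text, re.IGNORECASE):
            alerts.append((keyword, colour))
    return alerts

print(scan_post("Heavy flood reported downtown during the protest"))
```

Each alert tuple could then be plotted at the post's location on a map layer, which is essentially the "mash-up" the document describes.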

The FBI document says the information would be used to help it to predict the likely actions of "bad actors", detect instances of people deliberately misleading law enforcement officers and spot the vulnerabilities of suspect groups.

An FBI spokeswoman told the BBC that the software being researched was "no different than applications used by other government agencies" and that "the application will not focus on specific persons or protected groups, but on words... and activities constituting violations of federal criminal law or threats to national security."
Privacy permissions
The FBI issued the request three weeks after the US Department of Homeland Security released a separate report into the privacy implications of monitoring social media websites.

It justified the principle of using information that users have provided and not opted to make private.

"Information posted to social media websites is publicly accessible and voluntarily generated. Thus the opportunity not to provide information exists prior to the informational post by the user," it says.

It noted that the department's National Operations Center had a policy in place to edit out any gathered information which fell outside of the categories relevant to its investigations.

It listed websites that the centre planned to monitor. They include YouTube, the photo service Flickr, and Itstrending.com - a site which shows popular shared items on Facebook.

It also highlighted words it looked out for. These include "gangs", "small pox", "leak", "recall" and "2600" - an apparent reference to the hacking-focused magazine.
'Dragnet effect'
The London-based campaign group, Privacy International, said it was worried about the consequences of such activities.

"Social networks are about connecting people with other people - if one person is the target of police monitoring, there will be a dragnet effect in which dozens, even hundreds, of innocent users also come under surveillance," said Gus Hosein, the group's executive director.

"It is not necessarily the case that the more information law enforcement officers have, the safer we will be.

"Police may well find themselves overwhelmed by a flood of personal information, information that is precious to those it concerns but useless for the purposes of crime prevention."

The group noted that it was seeking information from the UK's Metropolitan Police Service about its use of social networks.
http://www.bbc.co.uk/news/technology-16738209

Wednesday, 21 December 2011

Social Media versus Knowledge Management

On the surface, social media and knowledge management (KM) seem very similar. Both involve people using technology to access information. Both require individuals to create information intended for sharing. Both profess to support collaboration.
But there's a big difference.
  • Knowledge management is what company management tells me I need to know, based on what they think is important.
  • Social media is how my peers show me what they think is important, based on their experience and in a way that I can judge for myself.

But, really, is that anyone's KM reality?
KM, in practice, reflects a hierarchical view of knowledge to match the hierarchical view of the organization. Yes, knowledge may originate anywhere in the organization, but it is channeled and gathered into a knowledge base (cistern) where it is distributed through a predefined set of channels, processes and protocols.
Social media looks downright chaotic by comparison. There is no predefined index, no prequalified knowledge creators, no knowledge managers and ostensibly little to no structure. Where an organization has a roof, gutters and cistern to capture knowledge, a social media organization has no roof, allowing the "rain" to fall directly into the house, collecting in puddles wherever they happen to form. That can be quite messy. And organizations abhor a mess.
It is no wonder, then, that executives, knowledge managers and software companies seek to offer tools, processes and approaches to tame social media. After all (they believe), "We cannot have employees, customers, suppliers and anyone else creating their own information, forming their own opinions and expressing that without our say. Think of the impact on our brand, our people, our customers. We need to manage this. We need knowledge management."
This is exactly the wrong attitude for one simple reason: It does not stop people from talking about you. Your workforce, customers, suppliers, competitors, etc., will talk about you whenever, wherever and however they want. Even pre-World Wide Web, these conversations were happening.
We're long past the time to seek control; it's time to engage people.
Business leaders recognize that engagement — not control of social media through traditional KM techniques — is the best way to glean value from the knowledge exchanged in social media. Control only leads to a "provide and pray" approach, and we have seen more than our share of "social media as next-generation KM" efforts fail to yield results.
So how do organizations gain value from social media, particularly in situations where they have not been successful with KM? The answer lies in a new view of collaboration: mass collaboration.
Mass collaboration consists of three things: social media technology, a compelling purpose and a focus on forming communities.
  • Social media technology provides the conduit and means for people to share their knowledge, insight and experience on their terms. It also provides a way for the individual to see and evaluate that knowledge based on the judgment of others.
  • Purpose is the reason people participate and contribute their ideas, experience and knowledge. They participate personally in social media because they value and identify with the purpose. They do so because they want to, rather than being told to as part of their job.
  • Communities are self-forming in social media. KM communities imply a hierarchical view of knowledge and are often assigned by job classification or encouraged based on work duties. Participation becomes prescribed, creating the type of "mandatory fun" that is the butt of many a Dilbert cartoon and TV sitcom. Social media allows communities to emerge as a property of the purpose and the participation in using the tools. This lack of structure creates the space for active and innovative communities.
The point here is that while they may seem similar, social media and KM are not the same. Recognizing the differences is a crucial step toward getting value out of both and avoiding a struggle of one over the other. 

Friday, 9 December 2011

An introduction to advanced analytics – What is text mining?

Source: http://www.joobworld.com/blog/2011/10/an-introduction-to-advanced-analytics-what-is-text-mining/
Date: 30.10.2011
Join one of our lead intelligence technologists, Karl Oaks, to learn how to 'extract nuggets of value' from text mines…
The purpose of this blog post is to give a high-level intro to a number of the advanced analytics technologies that are keeping us occupied here in the technology group at JOOB HQ.
Current research and development activities have us focused on a few key areas under the advanced analytics banner; these include machine learning, text mining, time series analysis and network analysis – with the idea being to apply aspects of these to bring additional value to the range of interesting investigative applications we are developing.
With that said, in this blog post we will be diving (or at least getting our toes wet) into technologies on the text mining side of things. So what is text mining? In a way it's not too dissimilar to any other kind of mining, where you are typically focused on extracting nuggets of value from a mine; in our case the value is information and the mine is the text itself. The text might take the form of documents, emails, tweets, forum posts or even blogs like this.
The wider area of text mining breaks down into a number of sub-areas; the ones we are specifically working on are entity extraction, concept extraction, document clustering and sentiment analysis.
Let's look a bit closer at each of these, starting with entity extraction. Entity extraction is the automated process of identifying and reporting on the "entities" within a block of text. These "entities" might be people, places, organisations, dates, phone numbers, addresses and so on, and depending on the type of entity you are interested in, there are several different techniques that can be applied to identify them more accurately.
For example, types such as email addresses and phone numbers follow common patterns and as such can be accurately identified through regular expressions. Providing knowledge in the form of gazetteers, taxonomies or unambiguous lists of words to represent entities such as people or countries is another technique. We have also started employing more sophisticated techniques that leverage statistical and machine learning models, such as Conditional Random Fields and Maximum Entropy, which we might cover in another blog post.
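A minimal sketch of the regular-expression technique for pattern-like entities such as emails and phone numbers (the patterns here are deliberately simplified illustrations, far looser than production-grade extractors):

```python
import re

# Illustrative patterns only; robust extractors use far stricter
# expressions, and names/places need gazetteers or CRF-style models.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s()-]{7,}\d")

def extract_entities(text):
    """Pull out pattern-like 'entities': email addresses and phone numbers."""
    return {
        "emails": EMAIL_RE.findall(text),
        "phones": [p.strip() for p in PHONE_RE.findall(text)],
    }

sample = "Contact karl.oaks@joobworld.com or call +61 2 9555 0100."
print(extract_entities(sample))
```

The contact details in the sample are invented for the example.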
Next up we have concept extraction; in a way this goes one level above entity extraction, extracting higher-level concepts rather than specific words, and making sense of a word within its sentence. To achieve this we have a knowledge base behind the technology, which it refers to for its higher-level concepts, as well as a mechanism to provide word sense disambiguation – in other words, clarifying the meaning in the particular sentence. One such knowledge base we currently utilise is a comprehensive Wikipedia knowledge base, where concepts are organised and structured according to the relationships among them. What is even more beneficial is that the Wikipedia knowledge base is updated frequently, which means our tools are always up to date as well – talk about harvesting knowledge from the web!
Once we have extracted our entities and our concepts we now have points of comparison within our set of documents. This is where document clustering comes in; we are able to construct clusters, or groupings, based on the concepts/entities extracted from the set of documents. This is possible because we can measure the relatedness of each of the concepts and entities. For example, an apple has a higher relatedness score to an orange compared to an aeroplane. This might sound easy, but I can assure you that the algorithms and techniques used to achieve this are pretty sophisticated.
Document or text clustering can be useful as a larger-scope relatedness measure for your documents, meaning documents in the same cluster largely share the same concepts/entities. With these clusters in place, if we are given a new document and cluster it against the existing clusters, it will naturally align with a particular set of documents based on its content. Neat, huh?
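To make the clustering idea concrete, here is a toy sketch in which plain word overlap (Jaccard similarity) stands in for the concept/entity relatedness scores described above; the clusters and documents are invented, and real systems use much more sophisticated measures:

```python
# Toy stand-in for relatedness: Jaccard similarity over word sets.
def terms(doc):
    return set(doc.lower().split())

def jaccard(a, b):
    return len(a & b) / len(a | b)

# Two invented, pre-built clusters of documents.
clusters = {
    "fruit":    [terms("apple orange banana"), terms("orange juice apple")],
    "aviation": [terms("aeroplane runway pilot"), terms("pilot cockpit aeroplane")],
}

def assign(new_doc):
    """Place a new document in the cluster with the highest average similarity."""
    t = terms(new_doc)
    scores = {
        name: sum(jaccard(t, d) for d in docs) / len(docs)
        for name, docs in clusters.items()
    }
    return max(scores, key=scores.get)

print(assign("fresh apple and orange"))
```

A new document about fruit naturally lands in the "fruit" cluster, which is the alignment behaviour described above.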
The last area of text mining we will touch on here is sentiment analysis. The purpose of sentiment analysis is to attempt to determine the attitude of the speaker, or the writer in the case of text mining, when discussing a particular topic. This can be approached in a number of different ways, but in our case we have built a machine learning model for classifying whether the person is speaking negatively, positively or neutrally on a given topic. The benefits can be seen in automatically detecting whether someone is talking about your product, person or place in a positive or negative way.
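The classifier described here is a trained machine learning model; purely as an illustration of the task's input and output, a much simpler lexicon-counting sketch (with an invented word list) might look like:

```python
# Invented toy lexicons; a trained model learns these cues from data
# rather than relying on fixed word lists.
POSITIVE = {"great", "love", "excellent", "good"}
NEGATIVE = {"terrible", "hate", "awful", "bad"}

def sentiment(text):
    """Classify text as positive, negative or neutral by lexicon counts."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this product, the service was excellent"))
```

A learned model replaces the fixed word lists with weights estimated from labelled examples, but the three-way output is the same.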

Thursday, 8 December 2011

12 Ways to (Legally) Spy on Your Competitors

Source: www.entrepreneur.com
BY Carol Tice

Ever wonder what your competitors are up to?

You should. They might be creating new products, planning to enter new markets -- or maybe they're floundering. If you knew, it could give you an edge.
Uncovering competitive information doesn't require donning a trench coat or hiring a computer hacker. There are plenty of perfectly legal ways to get below-the-radar competitive information.
Here are some time-tested methods that predate the Internet, as well as newer techniques to mine the wealth of information readily accessible online.

Old School

1. Read the local papers. Subscribe to the daily newspaper and business weekly in the cities where your primary competitors are based. You'll be surprised what competitors might say when they think they're just talking to a small, local audience.
"I cannot tell you the information we've gotten this way, in regular articles, about inventory, staffing, new plants and expansion plans," says Seena Sharp, Los Angeles-based principal at Sharp Market Intelligence. For instance, one of Sharp's clients in the garden-products industry learned exactly how a plant fire had affected a competitor, the capacity of the rebuilt plant and the marketing plan for the next year, all from a local newspaper. With this knowledge, the client crafted a strategy that countered the competitor's efforts and increased the client's market share.

2. Tap your vendors. Product suppliers and service providers talk regularly with all their clients. If you're on good terms with your vendors, Sharp says, chat them up and see what you can get them to spill about your competitors. Don't be pushy, though. Keep the conversation casual.

3. Go to trade shows. You can stand near competitors' booths at a busy time when it's easy to blend in with the crowd and eavesdrop on what they tell prospects. New initiatives often are announced at shows, Sharp notes, and chatty salespeople may reveal details. If you think you'll be recognized, send an employee or friend to listen.

4. Take a plant tour. For manufacturing competitors, see if the plant gives tours. Sharp says tour guides often brag about new products, new hires and expansion plans.

5. Play secret shopper. If competitors have stores, stroll the aisles and observe whether employees are responsive and facilities are clean, or whether shelves are empty and store phones go unanswered. Call the order line, too, so you can evaluate customer service, advises Sean Campbell, principal at competitive-research firm Cascade Insights in Oregon City, Ore.
6. Browse public documents. Publicly held companies must file reports with the U.S. Securities and Exchange Commission. Sharp also likes to read filings with the Environmental Protection Agency, the Patent and Trademark Office and local planning commissions to learn of building expansions and new products. Check with other state and federal agencies for signs of trouble such as tax liens, and comb legal filings for unexpected disclosures.

New School

7. Google your competitor's website. You can reveal hidden pages by doing Google searches such as "filetype:doc site:companyname.com," says August Jackson, a senior competitive intelligence analyst for Ernst & Young in McLean, Va. Change the file type to .pdf, .xls, or .ppt to turn up data or presentations. "It's surprising how many companies put this information up and think, 'If I don't link to it, no one will find it,'" Jackson says. You also can view the site's source code to see the meta tags or keywords being used to optimize its position in searches.
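The "view the source code" tip can be automated. A small sketch using Python's standard-library HTML parser to pull meta tags out of a page's source (the sample markup and company are invented):

```python
from html.parser import HTMLParser

class MetaTagParser(HTMLParser):
    """Collect <meta name="..." content="..."> pairs from page source."""

    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"]] = attrs["content"]

# Invented sample page source; in practice you would fetch the
# competitor's page and feed its HTML to the parser.
page = ('<html><head>'
        '<meta name="keywords" content="widgets, acme, export">'
        '<meta name="description" content="Acme product catalogue">'
        '</head></html>')

parser = MetaTagParser()
parser.feed(page)
print(parser.meta)
```

The keywords a company chooses to optimize for are themselves a small signal about what it considers important.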

8. Explore LinkedIn. On LinkedIn, you can sign up to follow a company and get notices when updates are posted on its LinkedIn page. You also can search a company's name on LinkedIn to find former employees and new hires, Jackson says. Salespeople may identify and brag about their clients on their personal LinkedIn page updates. If you're worried the company might recognize and block you, ask a colleague to follow the page.

9. Troll Twitter and Facebook chatter. If members of your industry hang out on Facebook, monitor their conversations. Music-rights agent Jennifer Yeko, president of True Talent Management in Beverly Hills, Calif., says she gets the scoop on the clients her competitors sign and the royalty rates they offer from posts made by her Facebook friends.
Many events have a Twitter hashtag that people use to chat and post speakers' comments live. If a competitor is speaking, tune in. Jackson has had success asking follow-up questions by responding and using the same hashtag.

10. Find competitors' job ads. Job portal Indeed is a great place for sussing out postings because it aggregates listings from many online job boards. Watch the skills a company may be hiring for; they're a leading indicator for new initiatives, says Campbell of Cascade Insights.
"We had a client curious about which American wireless carriers would offer Android phones," he says. "Just looking at job listings you could see who was trying to hire people with Android experience."

11. See who's on Quora. Popular with techies and venture capitalists, Quora holds a vast database of interesting competitive questions on such topics as a company's future plans. Often, company employees provide the answers, Campbell notes, and they generally reply using their true identities, unlike people on most Q&A sites.

12. Check Slideshare. Companies frequently use this popular portal to share slideshow presentations but forget to take them down, Jackson says. Presentations to potential investors, for example, may contain financial data, forecasts and information about new projects.
One note of warning: When researching online, be sure to consider the source. There are plenty of half-truths, gossip and misinformation online.

Thursday, 1 December 2011

Gartner Identifies the Top 10 Strategic Technologies for 2012

Source: Gartner, Inc.

Orlando, Fla., October 18, 2011—

Gartner, Inc. today highlighted the top 10 technologies and trends that will be strategic for most organizations in 2012. The analysts presented their findings during Gartner Symposium/ITxpo, being held here through October 20.
Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt.
A strategic technology may be an existing technology that has matured and/or become suitable for a wider range of uses. It may also be an emerging technology that offers an opportunity for strategic business advantage for early adopters or with potential for significant market disruption in the next five years. These technologies impact the organization's long-term plans, programs and initiatives.
“These top 10 technologies will be strategic for most organizations, and IT leaders should use this list in their strategic planning process by reviewing the technologies and how they fit into their expected needs,” said David Cearley, vice president and Gartner fellow.
“Organizations should start exploratory projects to look at promising candidate technologies and kick off a search for combinations of information sources, including social sites and unstructured data, that may be mined for insights,” said Carl Claunch, vice president and distinguished analyst at Gartner.

The top 10 strategic technologies for 2012 include:

Media Tablets and Beyond. Users can choose between various form factors when it comes to mobile computing. No single platform, form factor or technology will dominate and companies should expect to manage a diverse environment with two to four intelligent clients through 2015. IT leaders need a managed diversity program to address multiple form factors, as well as employees bringing their own smartphones and tablet devices into the workplace.
Enterprises will have to come up with two mobile strategies – one to address the business to employee (B2E) scenario and one to address the business to consumer (B2C) scenario. On the B2E front, IT must consider social goals, business goals, financial goals, and risk management goals. On the B2C front, which includes business to business (B2B) activities to support consumers, IT needs to address a number of additional issues such as surfacing and managing APIs to access enterprise information and systems, integration with third-party applications, integration with various partners for capabilities such as search and social networking, and delivery through app stores.

Mobile-Centric Applications and Interfaces. The user interface (UI) paradigm in place for more than 20 years is changing. UIs with windows, icons, menus, and pointers will be replaced by mobile-centric interfaces emphasizing touch, gesture, search, voice and video. Applications themselves are likely to shift to more focused and simple apps that can be assembled into more complex solutions. These changes will drive the need for new user interface design skills.
Building application user interfaces that span a variety of device types, potentially from many vendors, requires an understanding of fragmented building blocks and an adaptable programming structure that assembles them into optimized content for each device. Mobile consumer application platform tools and mobile enterprise platform tools are emerging to make it easier to develop in this cross-platform environment. HTML5 will also provide a long term model to address some of the cross-platform issues. By 2015, mobile Web technologies will have advanced sufficiently, so that half the applications that would be written as native apps in 2011 will instead be delivered as Web apps.

Contextual and Social User Experience. Context-aware computing uses information about an end user's or object's environment, activities, connections and preferences to improve the quality of interaction with that end user or object. A contextually aware system anticipates the user's needs and proactively serves up the most appropriate and customized content, product or service. Context can be used to link mobile, social, location, payment and commerce. It can help build skills in augmented reality, model-driven security and ensemble applications. Through 2013, context-aware applications will appear in targeted areas such as location-based services, augmented reality on mobile devices, and mobile commerce.
On the social front, the interfaces for applications are taking on the characteristics of social networks. Social information is also becoming a key source of contextual information to enhance delivery of search results or the operation of applications.

Internet of Things. The Internet of Things (IoT) is a concept that describes how the Internet will expand as sensors and intelligence are added to physical items such as consumer devices or physical assets and these objects are connected to the Internet. The vision and concept have existed for years, however, there has been an acceleration in the number and types of things that are being connected and in the technologies for identifying, sensing and communicating. These technologies are reaching critical mass and an economic tipping point over the next few years. Key elements of the IoT include:
  • Embedded sensors: Sensors that detect and communicate changes are being embedded, not just in mobile devices, but in an increasing number of places and objects.
  • Image Recognition: Image recognition technologies strive to identify objects, people, buildings, places, logos, and anything else that has value to consumers and enterprises. Smartphones and tablets equipped with cameras have pushed this technology from mainly industrial applications to broad consumer and enterprise applications.
  • Near Field Communication (NFC) payment: NFC allows users to make payments by waving their mobile phone in front of a compatible reader. Once NFC is embedded in a critical mass of phones for payment, industries such as public transportation, airlines, retail and healthcare can explore other areas in which NFC technology can improve efficiency and customer service.

App Stores and Marketplaces. App stores from Apple and Google provide marketplaces where hundreds of thousands of applications are available to mobile users. Gartner forecasts that by 2014, there will be more than 70 billion mobile application downloads from app stores every year. This will grow from a consumer-only phenomenon to an enterprise focus. With enterprise app stores, the role of IT shifts from that of a centralized planner to a market manager providing governance and brokerage services to users and potentially an ecosystem to support entrepreneurs. Enterprises should use a managed diversity approach to focus app store efforts and segment apps by risk and value.

Next-Generation Analytics. Analytics is growing along three key dimensions:
  1. From traditional offline analytics to in-line embedded analytics. This has been the focus for many efforts in the past and will continue to be an important focus for analytics.
  2. From analyzing historical data to explain what happened to analyzing historical and real-time data from multiple systems to simulate and predict the future.
  3. Over the next three years, analytics will mature along a third dimension, from structured and simple data analyzed by individuals to analysis of complex information of many types (text, video, etc…) from many systems supporting a collaborative decision process that brings multiple people together to analyze, brainstorm and make decisions.
Analytics is also beginning to shift to the cloud and exploit cloud resources for high performance and grid computing.
In 2011 and 2012, analytics will increasingly focus on decisions and collaboration. The new step is to provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action.

Big Data. The size, complexity of formats and speed of delivery of data exceed the capabilities of traditional data management technologies; managing the volume alone requires the use of new or exotic technologies. Many new technologies are emerging, with the potential to be disruptive (e.g., in-memory DBMS). Analytics has become a major driving application for data warehousing, with the use of MapReduce outside and inside the DBMS, and the use of self-service data marts. One major implication of big data is that in the future users will not be able to put all useful information into a single data warehouse. Logical data warehouses bringing together information from multiple sources as needed will replace the single data warehouse model.
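A minimal illustration of the MapReduce programming model mentioned above is word counting split into a map phase and a reduce phase; real deployments distribute both phases across many machines, but the shape of the computation is the same:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    """Map: emit a (word, 1) pair for every word in a document."""
    return [(word, 1) for word in doc.lower().split()]

def reduce_phase(pairs):
    """Reduce: sum the counts for each word across all emitted pairs."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

docs = ["big data big volume", "data warehouse"]
counts = reduce_phase(chain.from_iterable(map_phase(d) for d in docs))
print(counts)
```

Because each document is mapped independently and the reduce step only needs the grouped pairs, both phases parallelize naturally, which is what makes the model suit big-data workloads.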

In-Memory Computing. Gartner sees huge use of flash memory in consumer devices, entertainment equipment and other embedded IT systems. In addition, it offers a new layer of the memory hierarchy in servers that has key advantages — space, heat, performance and ruggedness among them. Besides delivering a new storage tier, the availability of large amounts of memory is driving new application models. In-memory applications platforms include in-memory analytics, event processing platforms, in-memory application servers, in-memory data management and in-memory messaging.
Running existing applications in-memory or refactoring these applications to exploit in-memory approaches can result in improved transactional application performance and scalability, lower latency (less than one microsecond) application messaging, dramatically faster batch execution and faster response time in analytical applications. As cost and availability of memory intensive hardware platforms reach tipping points in 2012 and 2013, the in-memory approach will enter the mainstream.
Extreme Low-Energy Servers. The adoption of low-energy servers — the radical new systems being proposed, announced and marketed by mostly new entrants to the server business — will take the buyer on a trip backward in time. These systems are built on low-power processors typically used in mobile devices. The potential advantage is delivering 30 times or more processors in a particular server unit with lower power consumption vs. current server approaches. The new approach is well suited for certain non-compute-intensive tasks such as map/reduce workloads or delivery of static objects to a website. However, most applications will require more processing power, and the low-energy server model potentially increases management costs, undercutting broader use of the approach.

Cloud Computing. Cloud is a disruptive force and has the potential for broad long-term impact in most industries. While the market remains in its early stages in 2011 and 2012, it will see the full range of large enterprise providers fully engaged in delivering a range of offerings to build cloud environments and deliver cloud services. Oracle, IBM and SAP all have major initiatives to deliver a broader range of cloud services over the next two years. As Microsoft continues to expand its cloud offering, and these traditional enterprise players expand offerings, users will see competition heat up and enterprise-level cloud services increase.
Enterprises are moving from trying to understand the cloud to making decisions on selected workloads to implement on cloud services and where they need to build out private clouds. Hybrid cloud computing which brings together external public cloud services and internal private cloud services, as well as the capabilities to secure, manage and govern the entire cloud spectrum will be a major focus for 2012. From a security perspective new certification programs including FedRAMP and CAMM will be ready for initial trial, setting the stage for more secure cloud computing. On the private cloud front, IT will be challenged to bring operations and development groups closer together using “DevOps” concepts in order to approach the speed and efficiencies of public cloud service providers.

Monday, 7 November 2011

Almaty (Kazakhstan), November 15, 2011. Presentation of a new version of Semantic Archive 4.0

 
Exciting new capabilities of the well-known analytical system Semantic Archive: real-time monitoring of online state databases, social networks (Twitter, Yandex.Blogs, Odnoklassniki [Classmates], VKontakte, etc.), blogs and forums, integration with SPARK, pre-set monitoring of popular Kazakh sites and more!
Analytical Business Solutions presents a new version, Semantic Archive 4.0. The new version offers conceptually new capabilities for automating the analytical work of commercial companies and governmental organizations. The system will be useful for marketing and PR departments, economic security services and competitive intelligence departments.

Dear colleagues!
We invite you and specialists of your companies to the free presentations to be held on

Tuesday, November 15, 2011 at 3PM
Hotel Shera Residence, Almaty, 42 Kairbekov str. (corner of Gogol street)

Using the system will significantly reduce the amount of routine Internet-search work your employees perform, increase the quality of analytical reports, and help you create an enterprise-level knowledge archive.

We will show live examples of the system in use.
We will distribute free materials, including:

  • Informational papers describing the system;
  • A demo version of Semantic Archive;
  • Samples of analytical reports created with the system.


In this version we implemented functions for monitoring social networks (Twitter, Yandex.Blogs, Odnoklassniki [Classmates], VKontakte, etc.), blogs and forums. The system is now integrated with the well-known analytical system SPARK and has pre-set robots for monitoring more than 200 Russian and foreign media sources, including 20 Kazakh sources such as Interfax-Kazakhstan, Kazakhstan Today, Kazinform, Kazakhstan Pravda and others.

Our specialists will answer all your questions and consult you on using the system.

To register please
call +7 (499) 745-43-83,
or visit our site http://www.anbr.ru/
or email us at seminar@anbr.ru


Tuesday, 1 November 2011

Oracle Buys Enterprise Search And Data Management Company Endeca


Source: http://techcrunch.com
Date: 18.10.2011

Oracle has acquired Endeca, a company that powers enterprise search for large companies. Financial terms of the deal were not disclosed. Endeca has raised a total of $65 million from Bessemer, Venrock, Intel, SAP, Ampersand Capital Partners, DN Capital and Lehman Brothers.
Endeca’s core technology enables companies to correlate and analyze unstructured data and provides enterprise search for large companies including Borders, Boeing, the Census Bureau, the EPA, Ford, Hallmark, IBM, and Toshiba. The company specializes in guided search, and auto-categorizing results based on the keywords someone enters. Endeca charges from $100,000 to more than $10 million per installation.
Endeca’s InFront offering provides businesses with tools for advanced merchandising and content targeting for e-commerce. And Endeca Latitude enables businesses to rapidly develop analytic applications that draw information and data together from unstructured and structured sources.
Oracle says that the combination of Oracle and Endeca is expected to create a more advanced enterprise data management platform. Companies will be able to process, store, manage, search and analyze structured and unstructured information together. For example, Oracle says the combination of Oracle's own commerce application, ATG Commerce, and Endeca InFront is expected to enhance cross-channel commerce, merchandising, and online customer experiences. And Oracle's Business Intelligence offering and Endeca Latitude will be combined as well to give businesses a more powerful analytics platform.
Endeca currently has over 600 customers.

Comment (translated from Russian):
One of the main products of the acquired Endeca is MDEX, an engine for processing unstructured information such as emails, text files, analytical reports and other digital data, along with the InFront CEM (customer experience management) product.
Today's business software market shows a characteristic trend: corporations produce ever more unstructured data, generated across different business units, social networks, corporate software and other sources. Almost all major vendors of business solutions are now actively working to bring to market offerings for processing such information arrays.