Monday, September 22, 2008

Search Engine Just A Dumb Computer

The way we search the web is all wrong, according to Danny Fine of Haifa-based BrainDamage. "When we search for information, we are the ones doing all the work, inefficiently inputting keywords and narrowing down the results until we find what we want. We're supposed to be the masters, not the slaves," he says. "So why are we doing all the work?"

Right now, there isn't much of an alternative, but when Fine gets through with the Internet, he asserts, it's going to be a whole different place.

There are billions, maybe even trillions, of pieces of data on the web, most of which consist of "units" of ideas eight words long or less. Nearly all search engines rely on some variation of keyword matching, often supplemented by techniques such as Latent Semantic Analysis or Indexing (LSA/LSI).

It's a form of artificial intelligence, based in large part on the work of linguist Noam Chomsky, who pioneered the application of mathematical principles to language. The system analyzes documents, creating a map of keywords and the semantic "distance" between them.

"The search engine doesn't really understand what you're asking, of course - it's just a dumb computer, after all," Fine tells ISRAEL21c. "The way it figures out what you're looking for is by comparing your request to a long list of keywords that are indexed in a database with other terms that could really be what you're looking for."
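The keyword lookup Fine describes can be sketched in a few lines. This is a minimal toy illustration, not BrainDamage's system or any real search engine; the three-document corpus and the scoring-by-overlap scheme are assumptions made purely for the example:

```python
# Toy illustration of keyword indexing: each document is reduced to a
# bag of words, and a query is answered by counting keyword overlap,
# with no understanding of what the words mean.
from collections import defaultdict

documents = {
    "doc1": "terror attack strikes city",
    "doc2": "my son was terrorizing us until he got his toys",
    "doc3": "toddler tantrum over missing toys",
}

# Build an inverted index: keyword -> set of documents containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.lower().split():
        index[word].add(doc_id)

def keyword_search(query):
    """Return documents ranked by how many query keywords they share."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for doc_id in index.get(word, set()):
            scores[doc_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(keyword_search("terrorizing toys"))  # → ['doc2', 'doc3']
```

The engine here "finds" documents only because strings coincide; it has no notion of whether "terrorizing" refers to misbehavior or to terrorism, which is exactly the limitation Fine is pointing at.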

"Take, for example, the sentence 'My son was terrorizing us until he got his toys,'" says Eli Abir, who designed the BrainDamage system and is the company's CTO. "Terrorizing" in this context, of course, means misbehaving, not acting like an acolyte of Bin Laden. Abir says that search engines have no way of knowing this, and as a result return many false positives. "Because BrainDamage's system relies on contextual logic, we can produce much more accurate results every time."
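Abir's actual "contextual logic" is proprietary and not described in the article, but the idea of using surrounding words to disambiguate a hit can be sketched very simply. The two hand-built context vocabularies below are assumptions invented for the example:

```python
# Toy sketch of contextual disambiguation (not Abir's actual method):
# score a candidate match by how many of its surrounding words fit a
# topic vocabulary, instead of trusting the keyword hit alone.
TERRORISM_CONTEXT = {"attack", "bomb", "security", "militant", "police"}
DOMESTIC_CONTEXT = {"son", "toys", "tantrum", "bedtime", "school"}

def context_score(sentence, context_words):
    """Count how many words in the sentence belong to the topic vocabulary."""
    words = set(sentence.lower().split())
    return len(words & context_words)

sentence = "my son was terrorizing us until he got his toys"

# A keyword-only engine flags this sentence because of "terrorizing";
# context scoring shows the surrounding words fit the domestic topic.
print(context_score(sentence, TERRORISM_CONTEXT))  # → 0
print(context_score(sentence, DOMESTIC_CONTEXT))   # → 2
```

Even this crude filter would down-rank the sentence for a terrorism query, which is the kind of false positive Abir says contextual analysis eliminates.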
