Real Time Search and Its Roots in Social Networking

Social networks have taken off on a massive scale. With their rise, a new kind of search algorithm is needed, one that can scour the enormous amount of data generated minute by minute. Traditional search engines such as Google have limited capabilities when it comes to surfacing freshly posted information. Site owners submit their sites to be crawled, and Google then indexes and ranks each site largely according to how much relevant material links back to it (a simplified explanation, but an explanation nonetheless). This is not really real-time search, since the index is built up gradually over the course of a website's life.

Google has long been the master of traditional search as it relates to the internet. The company now wants to expand its horizons to support real-time search: the ability to catch relevant information for any niche the second it is posted somewhere on the web. Problems arise from this, though. Keywords are important, but how do we know whether the keywords attached to newly captured information come from a trusted or authoritative source? That is one of the problems Google is working on.

One method that has been talked about is to couple the data with the person who wrote it. For instance, if the author has a high volume of friends or visitors to their profile (as it pertains to social networking), then there is a good chance the information is somewhat important or relevant. This is a bit of a band-aid fix, though, and while it may work for traditional … Read More
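
To make the idea concrete, here is a minimal sketch of such an author-weighted relevance score. The function name score_post, its inputs, and the exact weighting are all hypothetical and are not Google's actual formula; the point is simply that the author's social footprint boosts the keyword relevance of what they post.

```python
import math

def score_post(keyword_matches, author_followers, author_profile_visits):
    """Toy relevance score for a freshly posted update.

    keyword_matches: how many query keywords appear in the post
    author_followers / author_profile_visits: crude proxies for the
    author's authority, as described in the article
    """
    # Base relevance comes from keyword overlap with the query.
    relevance = keyword_matches

    # Dampen the authority signal with a logarithm so that accounts
    # with huge followings do not completely drown out everyone else.
    authority = math.log1p(author_followers) + 0.5 * math.log1p(author_profile_visits)

    return relevance * (1 + authority)

# The same post scores higher when it comes from a well-connected author
# than from an author with almost no friends or profile visitors.
print(score_post(3, 5000, 1200))   # well-connected author
print(score_post(3, 10, 4))        # barely-connected author
```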

Probability Based Processing

Probability-based processing works with best odds or closest matches rather than with switches represented as binary digits (ones and zeros). The central processing unit in the computer you use today (unless you are from the future and already have a probability-based processor) uses binary: it counts, computes and reads numbers made up of ones and zeros. Sounds like a lot to take in? Relax; it is pretty easy, in theory. There is more than one number system, not just the familiar base 10 system, which writes numbers with the digits zero through nine. Binary is a base 2 number system, using only zero and one, and any number (well, virtually anything really) can be represented as a collection of zeros and ones.
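
A short Python snippet makes the base 2 idea tangible. This is just an illustration of positional notation in software, not anything specific to a particular chip.

```python
# The same value written in base 10 and as a collection of ones and zeros.
value = 42

binary_string = bin(value)        # '0b101010'
back_to_decimal = int(binary_string, 2)

print(binary_string)              # 0b101010
print(back_to_decimal)            # 42

# Doing the conversion by hand: each bit stands for a power of two.
# 101010 = 1*32 + 0*16 + 1*8 + 0*4 + 1*2 + 0*1 = 42
manual = sum(int(bit) * 2**power
             for power, bit in enumerate(reversed("101010")))
print(manual)                     # 42
```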

This is how binary processing works. Computer chips are built to process collections of ones and zeros. Software is written and then compiled into machine code that the processing unit understands. So to really take computer processing forward, we need hardware (namely computer chips) that reads more than just collections of ones and zeros. That is where Lyric Semiconductor comes into the equation: the company has developed what is likely the first computer chip that processes using probability rather than binary.

This enables the chip to work with data in terms of probability or chance rather than a stream of binary. That promises faster processing components that are also more energy (and data) efficient. If you consider that the processor can work on blocks of data directly in terms of probability or chance, rather than on what are essentially flags represented by a stream of binary, then it starts to … Read More
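
To give a flavour of what "processing based on probability" means, here is a small software analogy: fusing two uncertain pieces of evidence instead of shuffling exact ones and zeros. This is ordinary Python written purely for illustration, with made-up numbers; it says nothing about Lyric's actual chip or its programming model.

```python
# Software analogy of a probability-based computation: combining two
# noisy "votes" about whether a received bit was a 1, the kind of
# belief-combining workload a probability processor is aimed at.

def fuse(p_a, p_b):
    """Combine two independent probability estimates that a bit is 1."""
    odds = (p_a / (1 - p_a)) * (p_b / (1 - p_b))
    return odds / (1 + odds)

estimate_one = 0.70   # first estimate: 70% sure the bit is a 1
estimate_two = 0.80   # second, independent estimate: 80% sure

combined = fuse(estimate_one, estimate_two)
print(f"combined belief the bit is 1: {combined:.2f}")   # about 0.90
```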