Friday, December 30, 2011

Compressed Sensing: An Overview #2 (Group Testing)

Here we will consider a situation where N soldiers have to be tested for a particular disease using a sensitive but expensive blood test. If blood samples from several soldiers are pooled together, the test is sensitive enough to return a positive result if any soldier in the pool is infected. If K of the N soldiers are infected and nothing more is known, M = N individual tests are needed; but if we know a priori that K = 1, then M = log2(N) pooled tests suffice. Here a few global measurements are taken, followed by a reasoning algorithm. This scheme was proposed by two economists during the Second World War for screening soldiers infected with syphilis: since few people were likely to be infected, pooling the blood samples saves tests on average. Though the test was never put into practice, the discussion on the subject continues! A similar principle is applied in compressive sensing.
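A minimal sketch of the K = 1 case in Python (the function name group_test and the simulated test are illustrative inventions, not from any reference): each pooled test probes one bit of the infected soldier's index, so ceil(log2 N) tests identify him.

```python
import math
import random

def group_test(N, infected):
    """Identify a single infected individual among N with ~log2(N) pooled tests.

    Pool j contains every individual whose index has bit j set; the pattern
    of positive pools spells out the infected index in binary.
    """
    tests = math.ceil(math.log2(N))
    index = 0
    for j in range(tests):
        pool = [i for i in range(N) if (i >> j) & 1]
        positive = infected in pool          # one pooled blood test
        if positive:
            index |= 1 << j
    return index, tests

random.seed(0)
N = 1000
infected = random.randrange(N)
found, num_tests = group_test(N, infected)
print(f"infected={infected}, found={found}, tests={num_tests}")  # 10 tests, not 1000
```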

The signal is projected onto random signals of the same length, and each projection constitutes one measurement. Viewed through the Fourier series, each random measurement signal is a linear combination of several frequency components, so a single projection extracts information about several frequencies of the original signal at once. With enough measurements, the signal can be reconstructed using an L1-norm optimization algorithm. The accuracy obtained this way is remarkable, but it rests on one strict condition: the signal must be sparse in terms of 'information content'. Mathematically, the number of nonzero coefficients representing the signal must be small. Luckily, most signals in nature satisfy this requirement, and that is exactly why signals are compressible.
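A hedged sketch of this recovery step (the sizes n, m, k and the Gaussian measurement matrix are assumptions made for the demo): basis pursuit, min ||x||_1 subject to Ax = y, recast as a linear program over [x, t] with -t <= x <= t and solved with SciPy.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 100, 40, 5               # signal length, measurements, sparsity

# k-sparse signal and random Gaussian measurement matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true                     # m << n linear measurements

# Basis pursuit as a linear program: minimize sum(t) s.t. Ax = y, -t <= x <= t
c = np.concatenate([np.zeros(n), np.ones(n)])
A_eq = np.hstack([A, np.zeros((m, n))])
A_ub = np.vstack([np.hstack([ np.eye(n), -np.eye(n)]),
                  np.hstack([-np.eye(n), -np.eye(n)])])
res = linprog(c, A_ub=A_ub, b_ub=np.zeros(2 * n),
              A_eq=A_eq, b_eq=y, bounds=[(None, None)] * (2 * n))
x_hat = res.x[:n]
print("recovery error:", np.linalg.norm(x_hat - x_true))  # near zero
```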

Thursday, December 29, 2011

Compressed Sensing : An Overview #1

We shall try to build intuition for the problem of compressed sensing through a few analogous examples.

There are N buckets of gold coins. Every coin weighs 2 grams, except in one defective bucket, whose coins weigh 1.8 grams each. The problem is to identify that particular bucket.

The Nyquist-style approach would be to weigh a coin from every bucket separately, which is point-wise sampling. Alternatively, we can number the buckets and take from each a number of coins equal to its number: bucket 1 gives 1 coin, bucket 2 gives 2 coins, and so on, up to N coins from bucket N.

Thus a total of 10 buckets gives us 55 coins. If all the coins weighed 2 grams, the total would be 110 grams; but if x is the actual measured weight, the defective bucket's number is (110 - x)/(2 - 1.8). Thus just one measurement, and the bucket number is retrieved. There is a critical assumption made to achieve this solution: we have only one defective bucket. This, in fact, is the 'sparsity prior'. There is only one piece of information to retrieve, so a single linear measurement serves the purpose; more information would call for additional, different linear combinations. This, in fact, is what is called group testing (see the sketch below).
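A small sketch of the weighing trick (the bucket count, weights and the helper name find_defective_bucket are just this post's example, not a standard routine):

```python
# One weighing finds the defective bucket: take n coins from bucket n.
# Good coins weigh 2 g, defective ones 1.8 g, so the shortfall from the
# all-good weight of 110 g reveals the bucket number.
def find_defective_bucket(defective, n_buckets=10, good=2.0, bad=1.8):
    coins = sum(range(1, n_buckets + 1))              # 55 coins in total
    x = good * coins - (good - bad) * defective       # the single measurement
    return round((good * coins - x) / (good - bad))   # (110 - x) / (2 - 1.8)

for b in range(1, 11):
    assert find_defective_bucket(b) == b
print("one weighing identifies any single defective bucket")
```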

Monday, December 26, 2011

Nuit Blanche: Bilateral Random Projections

The blog linked below is very helpful for keeping yourself up to date with sparse processing in signal-processing applications.
Nuit Blanche: Bilateral Random Projections: Reminded of it thanks to Sergey, here is the simple but powerful Bilateral Random Projections by Tianyi Zhou and Dacheng Tao. The a...

Sparse Processing: Time to Unlearn Shannon-Nyquist Theorem?

 Signal processing is an ever-dynamic domain that has kept changing its course day after day since its inception. With an increasing number of functions being pushed into sophisticated software algorithms, there is very little room left for circuit-level processing. This demands a high degree of visualization and mathematical thought from budding engineers.

 The technique one inevitably turns to when sampling natural signals into the discrete domain is the famous Shannon-Nyquist criterion. But it demands the ideal condition of a band-limited signal, which never truly occurs in practice: expansion in the frequency domain forces a corresponding compression in the time domain and vice versa, and this is the source of the errors that arise when we approximate real signals as band-limited. Moreover, although we sample and transform the entire spectrum of the signal, the required information is concentrated in a few hot spots of the transformed signal. This is the feature exploited by conventional compression techniques, as the sketch below illustrates.
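A small numerical illustration of that concentration (the test signal, its tone frequencies and the use of the FFT are assumptions made up for this sketch): keep only a handful of the largest transform coefficients and the signal survives almost intact.

```python
import numpy as np

# A signal that is compressible: its energy concentrates in a few
# transform-domain "hot spots".
n = 1024
t = np.linspace(0, 1, n, endpoint=False)
rng = np.random.default_rng(1)
signal = (np.sin(2 * np.pi * 13 * t) + 0.5 * np.sin(2 * np.pi * 57 * t)
          + 0.01 * rng.standard_normal(n))

coeffs = np.fft.fft(signal)
keep = 8                                   # keep only the 8 largest coefficients
small = np.argsort(np.abs(coeffs))[:-keep] # indices of everything else
compressed = coeffs.copy()
compressed[small] = 0                      # conventional transform compression
reconstruction = np.fft.ifft(compressed).real

err = np.linalg.norm(signal - reconstruction) / np.linalg.norm(signal)
print(f"relative error keeping {keep}/{n} coefficients: {err:.3f}")
```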

  "Compressive sampling" has emerged. By using nonlinear recovery algorithms, super-resolved signals and images can be reconstructed from what appears to be highly incomplete data. Compressive sampling shows us how data compression can be implicitly incorporated into the data acquisition process, a gives us a new vantage point for a diverse set of applications including accelerated tomographic imaging, analog-to-digital conversion, and digital photography. 

 The recently discussed sparse technology treats every signal in the universe as sparsely distributed and aims at acquiring these sparse elements directly from the signal, thus saving energy, memory and computational complexity. Interestingly, to be well versed in sparse processing, the main mathematical tool is solving linear systems of equations. With a mathematical mind and a proper understanding of L1-Magic, the standard compressive-sensing problems can be computed easily. All it requires is a team of efficient mathematicians and engineers with a proper eye for the structure of natural signals.
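Besides the L1-Magic route, greedy methods built from nothing but least-squares solves also work; below is a minimal orthogonal matching pursuit (OMP) sketch, an illustrative substitute rather than anything taken from L1-Magic, with all problem sizes chosen arbitrarily.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: a greedy alternative to L1 minimization
    that recovers a k-sparse x from y = Ax using only least-squares solves."""
    residual, support = y.copy(), []
    for _ in range(k):
        # pick the column most correlated with the current residual
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # re-fit on the chosen support and update the residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(2)
n, m, k = 200, 60, 6
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = omp(A, A @ x_true, k)
print("OMP recovery error:", np.linalg.norm(x_hat - x_true))
```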

 The optimization techniques used in this method have proved much more efficient and faster than the conventional ones. The mathematical proof behind compressive sensing was provided by the Fields Medalist Terence Tao.

 This arena offers a wide scope for research, as various techniques like inpainting and denoising, which are otherwise complex, can be approached with just the usual differential equation of heat; a toy sketch follows.
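As a toy illustration of the heat-equation connection (the step size, iteration count and test signal are all arbitrary choices for the demo, not a prescribed method):

```python
import numpy as np

# Denoising by the heat (diffusion) equation: each explicit Euler step
# u <- u + dt * u_xx smooths the signal a little more.
rng = np.random.default_rng(3)
n = 512
clean = np.sin(np.linspace(0, 4 * np.pi, n))
noisy = clean + 0.3 * rng.standard_normal(n)

u = noisy.copy()
dt = 0.2                                         # stable for dt <= 0.5 (dx = 1)
for _ in range(50):
    u_xx = np.roll(u, -1) - 2 * u + np.roll(u, 1)  # discrete Laplacian
    u += dt * u_xx

print("noisy error   :", np.linalg.norm(noisy - clean))
print("smoothed error:", np.linalg.norm(u - clean))
```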


We shall discuss all this in much more detail in the coming days.



Sunday, December 18, 2011

Future of the IT Industry???

 It seems that traditional outsourcing is almost on its deathbed; the Indian story of IT outsourcing has lost much of its sheen. Earlier, global IT service providers helped clients realize cost savings through outsourcing channels such as applications, infrastructure, business process and offshore product development. The Indian IT industry carved a niche for itself in the outsourcing marketplace by pioneering the global delivery model and developing strong differentiators like process maturity and cost arbitrage. The benefits of outsourcing are obvious, but client dissatisfaction appears to be growing along many dimensions: quality of service, innovation, speed of response to business changes, and cost savings. In a dynamic world like ours, the only constant is change. The Indian IT industry's inflection point can be attributed to the dot-com bubble burst, the 2008 financial meltdown, cloud-computing innovations, mobility and the "booming" Indian economy.
 We often tend to look at life in a linear fashion, with the assumption that the past is the best indicator of the future and that the trend under observation will continue to grow (or fall). However, most trends in reality follow an S-Curve – with a slow and steady start, rapid growth, a slowdown, followed by a decline.
 The IT industry of India can be classified into three phases.
  1. Wave 1, which we can trace to the 80s and 90s, clearly established that Indian IT professionals were competent and could be trusted to deliver world-class work. This was the staff-augmentation era of the industry, largely delivered through onsite work.
  2. Wave 2, starting off in the mid-90s and currently at its peak, established India as an offshore programming destination. With labor arbitrage as the basic value proposition, Indian companies established large offshore development centers that had competent technical staff, mature CMM processes and world-class infrastructure. While the trigger for Wave 2 was the offshore initiatives by companies like GE, Motorola, Nortel etc., the Y2K bug gave it the necessary momentum. Although things slowed a bit after the dotcom bust, the shrinking IT budgets actually gave an impetus to large Fortune 500 companies to use offshore centers as a mainstream sourcing option.
  3. With rising salaries, the appreciating rupee and recessionary pressures in the US, it is difficult to see the Indian industry continuing to sustain a 40-50% growth rate in the labor-arbitrage mode. Hence there is a genuine question whether Indian outsourcing is on the decline. This is the classic S-Curve in operation, and it calls for a third wave, or what Kaushik Chatterjee of Wipro Technologies calls "Outsourcing 3.0".


 Outsourcing 3.0, or Wave 3, is strategic, value-added and non-linear. Capitalizing on the large pool of technical talent available in India and the ready availability of domain experts in the western world, Indian companies need to start making substantial investments in building intellectual property: not necessarily as packaged software, but also as frameworks, components, web services and the like. We need to move up to create solutions that have strategic impact and C-level visibility within client organizations. We need to own significant parts of transformation-initiative budgets and be equipped to convert our CMM advantages into predictable deliveries.
 This next wave could take Indian software services exports beyond the $100 billion mark! Far from being a death knell for the Indian industry, the decline in Wave 2 work looks like a necessary precondition for the emergence of the third wave. Industry giants like Infosys and Wipro have already taken the leap to emerging cloud solutions and other technologies. The only question is whether the pace of that leap is fast enough to sustain Indian leadership against the prevailing threats.

References
  1. "Outsourcing 3.0: Road Ahead for the Indian IT Industry", Kaushik Chatterjee, ITSM 2o11, IPF