I think it's something Leica dreamed up to cream more money off their followers. Surely whatever perceived sharpness difference you can see from their camera should be easily replicated in post-production?
No.
With a Bayer sensor, the Colour Filter Array (CFA) means that you have to interpolate (i.e. guess) the missing colour values at each individual photosite from the photosites surrounding it.
Bayer Array (one 2×2 tile):

    R G
    G B
So you're measuring the Red value at the Red photosite, but you have no information about the level of Blue or Green light at that point. Similarly, you only measure the amount of blue light at the Blue photosite and have no information about the levels of Green and Red light there, and so on.
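To make the interpolation concrete, here is a minimal bilinear sketch of recovering the Green channel at every photosite of an RGGB mosaic. This is an illustration only: the RGGB layout and plain 4-neighbour averaging are simplifications, and real raw converters use far more sophisticated demosaicing algorithms (AHD, VNG, etc.).

```python
import numpy as np

def demosaic_green_bilinear(raw):
    """Estimate the Green value at every photosite of an RGGB mosaic.

    `raw` is a 2-D array of sensor readings. Green is measured directly at
    half the sites; at Red and Blue sites it must be guessed by averaging
    the Green neighbours above, below, left, and right.
    """
    h, w = raw.shape
    # In an RGGB tile, Green sits at (0,1) and (1,0) of each 2x2 block,
    # i.e. wherever row + column is odd.
    green_mask = np.indices((h, w)).sum(axis=0) % 2 == 1
    g = np.where(green_mask, raw, 0.0).astype(float)

    # Sum the 4-neighbours; edge sites use whatever neighbours exist.
    padded = np.pad(g, 1)
    mask_p = np.pad(green_mask, 1)
    neighbour_sum = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                     padded[1:-1, :-2] + padded[1:-1, 2:])
    neighbour_cnt = (mask_p[:-2, 1:-1].astype(int) + mask_p[2:, 1:-1] +
                     mask_p[1:-1, :-2] + mask_p[1:-1, 2:])
    interpolated = neighbour_sum / np.maximum(neighbour_cnt, 1)

    # Keep measured Green where we have it; use the guess elsewhere.
    return np.where(green_mask, raw, interpolated)
```

The same trick has to be repeated for Red and Blue (with sparser sampling, since each occupies only a quarter of the sites), which is exactly the guessing a monochrome sensor avoids.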
With a panchromatic sensor (no CFA) you're measuring the luminance level, across the whole spectrum, at every point on the sensor, so no interpolation is required.
A Bayer array has two Green photosites for every one of Red and Blue, since that is where the largest proportion of the luminance data for human vision comes from. This is also why you see more noise in the red and blue channels when you examine an image from a Bayer-array camera in Photoshop - the green channel started out with twice as much information as the others.
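You can see how heavily luminance leans on green from the standard Rec. 709 luma weights (other standards such as Rec. 601 use slightly different numbers, but green dominates in all of them):

```python
def luma(r, g, b):
    """Perceived luminance of a linear RGB triple, using the Rec. 709
    coefficients. Green alone carries roughly 70% of the weight, which is
    why the Bayer pattern doubles up on Green photosites."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
```

A pure green pixel contributes more to perceived brightness than pure red and pure blue combined, so noise in the green channel is far more visible than the same noise in red or blue.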
Moreover, Bayer gathers its colour information by filtering out roughly two thirds of the light spectrum at every point it measures, so you lose that much sensitivity compared with a naked (panchromatic) photosite. Put another way, it's noisier.
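Treating each colour filter as passing exactly one third of the light is a back-of-envelope simplification (real CFA dyes have broader, overlapping passbands, so the true loss is smaller), but it gives a feel for the scale in photographic stops:

```python
import math

# If a colour filter passed only 1/3 of the incoming light, the loss in
# photographic stops would be log2(3) - an upper-bound estimate, since
# real filter dyes pass more than a strict third of the spectrum.
loss_stops = math.log2(3)
print(f"about {loss_stops:.2f} stops")  # about 1.58 stops
```

Even allowing for the generosity of real filter passbands, that is a meaningful head start in low light for a sensor with no CFA at all.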
Practical example
http://diglloyd.com/blog/2007/20070727_1-Monochrome_vs_Color.html
One downside of a monochrome sensor is that you lose any latitude for overexposure. Once a photosite is maxed out, it's blown. With Bayer (and other sensors with a CFA) there may be some residual information in the other colour channels with which to reconstruct a partial image, even if one or two of the colour channels are blown.
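A crude sketch of that reconstruction idea, assuming normalised sensor values that clip at 1.0: where one channel has clipped, re-estimate it from the surviving channels using the channel ratio observed in unclipped pixels. Real raw converters' highlight-recovery modes are considerably more careful than this.

```python
import numpy as np

CLIP = 1.0  # normalised sensor full-scale

def reconstruct_clipped_green(r, g, b):
    """Where G has clipped but R and B have not, re-estimate G by scaling
    R + B by the G/(R+B) ratio seen in unclipped pixels. Crude, but it
    shows why spare colour channels buy some overexposure latitude."""
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    clipped = g >= CLIP
    ok = ~clipped & (r + b > 0)
    if not ok.any() or not clipped.any():
        return g  # nothing to fix, or nothing to learn the ratio from
    ratio = np.median(g[ok] / (r[ok] + b[ok]))  # typical G vs R+B balance
    out = g.copy()
    out[clipped] = np.maximum(g[clipped], ratio * (r[clipped] + b[clipped]))
    return out
```

On a monochrome sensor there is no second channel to borrow from, so a blown highlight is simply gone.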