Forgive me if this sounds like a rant - I'm jetlagged to hell in another wrong timezone.
Converters are WAY more than a shift register and clocks.
Inside a modern audio ADC you have a delta-sigma modulator, typically running at 64 to 256 times fs (fs being the sample rate).
If it's a one-bit modulator, it's basically tracking the input: each clock, the comparator says whether the signal is above or below the loop's running estimate, and the feedback loop pushes the quantization noise up out of the audio band. There are multi-bit modulators on the market, but they're a little more complex.
From there, the high-speed, low-bit-count data is sent to the digital filter/decimator, which, in layman's terms, converts the 1-bit 2.8 MHz stream (64 x 44.1 kHz) down to 24-bit 44.1 kHz data. This filter typically takes up quite a bit of silicon real estate, though.
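To make the idea concrete, here's a toy sketch of both stages - a first-order one-bit modulator and a crude boxcar decimator. This is an illustration, not any real chip's architecture: real converters use higher-order loops and proper sinc/FIR decimation filters, and the function names here are made up.

```python
def delta_sigma_1bit(samples):
    """First-order delta-sigma modulator (toy version).

    Integrates the error between the input and the fed-back 1-bit
    output, then quantizes the integrator to a single bit. The bit
    density of the output tracks the input level.
    """
    integ = 0.0   # loop integrator
    fb = 0.0      # 1-bit feedback DAC output (+1 or -1)
    bits = []
    for x in samples:            # x expected in [-1, 1]
        integ += x - fb          # accumulate the error
        bit = 1 if integ >= 0 else 0
        bits.append(bit)
        fb = 1.0 if bit else -1.0
    return bits

def decimate(bits, ratio):
    """Crude decimator: average each block of `ratio` bits back to one
    multi-bit sample. A real chip filters before dropping the rate."""
    pcm = []
    for i in range(0, len(bits) - ratio + 1, ratio):
        block = bits[i:i + ratio]
        pcm.append(2.0 * sum(block) / ratio - 1.0)  # density -> [-1, 1]
    return pcm

# Feed a DC level of 0.5 through at 64x oversampling: the decimated
# output settles around 0.5, recovered purely from the 1-bit stream.
bits = delta_sigma_1bit([0.5] * 64 * 100)
pcm = decimate(bits, 64)
print(sum(pcm) / len(pcm))
```

The point of the toy: the multi-bit resolution comes entirely from the high-rate 1-bit stream plus the decimation filter, which is why the filter's speed and size matter so much.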
Anyway - point is, depending on the converters used, the output rate may not be achievable: the modulator may be too slow, or the digital filter may be limited (to keep the silicon real estate small and the device cheap).
Add to that limits elsewhere in the system. For instance, a USB 1.x device typically struggles to maintain greater than 44.1 kHz in both directions at once. There are a few implementations - however, most of the ones that support 96 kHz can only do so in one direction (e.g. a stereo ADC, record only).
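You can see why with some back-of-the-envelope arithmetic. This is a rough sketch that ignores protocol overhead and packet framing, so treat the numbers as illustrative only - the 12 Mbit/s figure is the raw USB full-speed bus rate, and real isochronous throughput is meaningfully lower.

```python
def audio_bitrate(rate_hz, bits, channels):
    """Raw PCM payload rate in bits per second (no USB overhead)."""
    return rate_hz * bits * channels

USB_FULL_SPEED = 12_000_000  # USB 1.x full-speed raw bus rate, bits/s

for rate in (44_100, 96_000):
    one_way = audio_bitrate(rate, 24, 2)       # stereo, 24-bit
    duplex_fraction = 2 * one_way / USB_FULL_SPEED
    print(f"{rate} Hz: {one_way / 1e6:.2f} Mbit/s one way, "
          f"duplex uses {duplex_fraction:.0%} of the raw bus")
```

Stereo 24-bit at 44.1 kHz duplex is around a third of the raw bus; at 96 kHz duplex it's roughly three quarters before any overhead, which is why most USB 1.x boxes that do 96 kHz only manage it one way.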
I don't think you'll notice a significant difference running at a higher sample rate. Converters from 10 years ago (when the 001 was released) typically had rising noise levels at higher sampling frequencies - the out-of-band noise from the modulator tends to start rising at 40 kHz or so.
Ruairi nailed it best - there are so many other angles to improve before you get to your Digi 001. The converters are typically the cheapest part to improve these days, but if you're still using cheap microphones in the spare room of your house, there's a lot more improvement to be had in the signal chain and the room. Even the best converters in the world will sound like crap if you feed them crap.
Good luck
/R