Indeed it's kind of a weird request lol.
I'm sorry, I may not explain this very well or be fully accurate, since I don't have much knowledge of technical stuff like this.
It's not USB, it's a mini-HDMI cable (the cable only uses the HDMI connector; it carries the PCIe signal) that connects the hwtools PE4H 2.4 or 2.0a to the ExpressCard/34 slot (= mPCIe, which is equivalent to PCI Express x1).
I no longer use the PE4H 2.4 but a PE4L 2.1b PM060A from hwtools, and I use an Xbox PSU to power my GPU.
My setup:

I use an mPCIe port, but it's Gen 2 (5 GT/s per lane, so 500 MB/s according to Wikipedia), which was introduced on laptops with Sandy Bridge.
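Out of curiosity, here's the arithmetic behind that 500 MB/s figure. This is just a rough sketch; the 8b/10b factor is my assumption of how the Wikipedia number is derived:

```python
# PCIe 2.0 runs at 5 GT/s per lane with 8b/10b line coding,
# so only 8 of every 10 bits on the wire carry actual data.
transfers_per_s = 5_000_000_000        # 5 GT/s
data_bits_per_s = transfers_per_s * 8 / 10   # strip 8b/10b coding overhead
bytes_per_s = data_bits_per_s / 8      # bits -> bytes

print(bytes_per_s / 1_000_000)  # -> 500.0 (MB/s per lane)
```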
eGPU users like me only get a nice performance increase from 16-bit when running on the internal monitor (it doesn't affect performance on an external monitor). The internal display is driven by the NVIDIA Optimus driver, which provides a transparent internal LCD cloning mode as long as you have an Intel iGPU.
So I can play on my laptop's LCD, but at a cost in performance: since the GPU is not physically inside the laptop, the rendered images have to travel back to the laptop, and that consumes a huge amount of the bandwidth which is normally dedicated only to the "data" that goes to the GPU for rendering.
On an external LCD performance is very good; I easily get 70-75% of the performance expected from a desktop with similar components.
On the internal LCD we have to deal with a general performance loss and an FPS cap, which on my setup is 50 at 1920x1080 in 32-bit mode. If I set 1600x900 I get 72 fps at most, which is 44% more than 50 fps, because full HD has 44% more pixels. But in 16-bit at 1920x1080 I believe the FPS cap would be at least 85 fps, since in the MSI Kombustor benchmark I got 45 fps in 32-bit but 85 fps in 16-bit.
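For what it's worth, those caps line up roughly with what you'd expect if the link were saturated just copying frames back to the laptop. A quick Python sketch (my own simplified model; it ignores compression and protocol overhead, which is presumably why the real caps come out lower):

```python
def fps_cap(width, height, bytes_per_pixel, link_mb_s=500):
    """Theoretical fps ceiling if every rendered frame must be copied
    back over a link with link_mb_s MB/s of usable bandwidth."""
    frame_mb = width * height * bytes_per_pixel / 1e6  # size of one frame
    return link_mb_s / frame_mb

print(round(fps_cap(1920, 1080, 4)))  # -> 60   (1080p, 32-bit; observed cap: 50)
print(round(fps_cap(1920, 1080, 2)))  # -> 121  (1080p, 16-bit; observed: 85)
print(round(fps_cap(1600, 900, 4)))   # -> 87   (1600x900, 32-bit; observed: 72)
```

The observed caps sit consistently at about 80% of these ceilings, which fits the idea that the copy-back traffic is the bottleneck.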
Another guy from the eGPU community saw a 72% FPS increase in Resident Evil 4, 25% in Devil May Cry 4 (DX10), and nearly 40% in Unigine Heaven (DX11).
He also said: "I think that 24 bit color would be ideal. This would have the same color spectrum of 32bit, but knock off the rarely used transparency bits that 32bit has, which really just waste bandwidth."
If you are interested in DIY eGPUs for notebooks, you can visit the techinferno forum and look for the "DIY eGPU experiences [version 2.0]" thread; there is a ton of information and it's much better explained.
24-bit would be the best of both worlds for most games: better performance with no visual hit. But in Skyrim, I honestly see no difference when looking at my screenshots taken in 16-bit mode.
As you say, most recent games don't natively support 16-bit, and if you could do it for Skyrim the same way you'll be doing supersampling, that would be so super great.
