Sorry, but I'd really argue against the mphidflash approach you've described. I do understand the reasoning, but it limits the end user to using the USB only as a virtual COM port.
I know none of the chipKIT boards currently come with a true (non-FTDI) USB connector (a USB "shield" is coming out this summer for the chipKIT Max32, though apparently never for the Uno32). Still, we surely want to be able to do all the tricks that shield will enable for the Max32 on our UBW32 and CUI32 boards as well.
So instead, I would recommend only doing the following if a "USBSerial.begin(9600)" call (extrapolated from the standard Arduino "Serial.begin", but aimed specifically at the PIC32's built-in USB) is found in the user's sketch:
the chipKIT libraries linked into the sketch/application would include the Microchip USB stack, operating as a CDC device and hooked into the serial library of the chipKIT core functions,
but without this requirement:
set to run in INTERRUPT mode (so that a while(1) in the sketch doesn't disable the ability to reset the processor).
There would be no need for this modification to the Microchip USB stack (having a toggle of one of the hardware handshaking lines reset the processor) in this scenario, since some user sketches may disable interrupts anyway.
I do realize that I'm suggesting we go without auto-reset here, but why shouldn't we let people use the full hardware functionality of the CUI32/UBW32 via MPIDE? I would much prefer people to be limited only by the hardware design rather than by decisions made for them on the software end of things...
What do we all think?