The cell manufacturers recommend letting a pack cool down from a high-current discharge before charging, so temperature is clearly an issue, but I'm curious why you believe the key factor is the pack dropping in temperature, rather than it simply being at room temperature or thereabouts.
Mostly because I'm paying much closer attention to the situation at a deeper level, going down to pack construction and which cells I use. I'm typically after a pack build that simply doesn't increase in temperature from high-current discharge. For example, you can use Samsung 25R cells and flog the bejesus out of them, and that cell is known not to heat up. Its only drawback is that it's not particularly energy-dense, so you need more cells to make a high-capacity pack... but if you make a 14S10P 25R pack, there's pretty much nothing you can do to heat it up because it's so big. Sure, if I had a 14S3P pack of Samsung 30Q's, which are meant to run hot and really, really do, then you'd have a whole different set of rules to play by.
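To put rough numbers on why the parallel count matters: cell heating scales with the square of per-cell current, and a 10P pack splits the same load ten ways. A quick sketch (the 40 A pack draw is an assumed figure for illustration; the continuous-current ratings are the headline datasheet numbers for these two cells):

```python
# Per-cell discharge load for two 14S pack builds.
# The 40 A pack draw is a hypothetical example, not a figure from this thread.

def per_cell_load(pack_amps, parallel, cont_rating_amps, label):
    amps_per_cell = pack_amps / parallel            # load divides across the cluster
    headroom = cont_rating_amps / amps_per_cell     # margin under the cell's rating
    print(f"{label}: {amps_per_cell:.1f} A/cell, "
          f"{headroom:.1f}x under its {cont_rating_amps:.0f} A continuous rating")

DRAW = 40.0  # assumed peak pack current, in amps

per_cell_load(DRAW, 10, 20.0, "14S10P of 25R")  # -> 4.0 A/cell, 5.0x headroom
per_cell_load(DRAW, 3, 15.0, "14S3P of 30Q")    # -> 13.3 A/cell, 1.1x headroom
```

The 3P build runs each cell near its limit, which is exactly where the heat comes from.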
Also, I typically charge at 1 A or less, so my charge rate is so low I won't be heating up the pack from charging. Take that into account along with the cells chosen, and on a 100-degree day I know from my onboard temp sensor that my cells are nowhere near the roughly 140 °F (60 °C) ceiling the cell manufacturers specify, let alone actual thermal-runaway territory. If the pack is decreasing in temperature, it's going to keep decreasing; 1 A of charge current isn't going to move the needle on heat (volts × amps = watts, so I'm feeding only about 50 W into a 25-30 Ah pack... peanuts).
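The arithmetic behind the "peanuts" claim, using 51.8 V as the nominal voltage of a 14S pack (14 × 3.7 V) and the low end of the 25-30 Ah capacity range:

```python
# Charge-side power and C-rate for the ~1 A scenario above.

CHARGE_AMPS = 1.0
PACK_VOLTS = 51.8     # nominal voltage of a 14S pack (14 x 3.7 V)
CAPACITY_AH = 25.0    # low end of the 25-30 Ah range mentioned above

watts_in = CHARGE_AMPS * PACK_VOLTS
c_rate = CHARGE_AMPS / CAPACITY_AH

print(f"Charge power: ~{watts_in:.0f} W")  # ~52 W into the whole pack
print(f"Charge rate: {c_rate:.2f}C")       # 0.04C -- a trickle by any standard
```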
And if the pack is running at roughly ambient +5 degrees Fahrenheit at most when it's 100+ out, that means on a 60-degree day my pack is at 60-62 (not a guess, since I have that temp sensor). That's not a temperature to be in any way concerned about.
How would this balance-charge the pack? Any cell balancing (if present) would be handled by the BMS internal to the pack, since charging and discharging only use the two end terminals of the series string.
The BMS only balances the pack when the charge level reaches the ballpark of 97-98%, so if you are charging to 80% regularly, your BMS is never balancing. There are some very rare Bluetooth BMSes that allow balancing at lower levels, but they may as well be unicorns. So if you are doing 80% charges daily, a once-a-month balance charge to 100% is a decent preventative measure. Note that a balance charge means you have to let the pack sit at 100% for a few hours while still connected. I usually use my Satiator for them, as you can see it flick on and off at decreasing frequency as the cell groups balance.
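A toy model of that top-balancing behavior (the thresholds here are illustrative guesses, not any particular BMS's firmware; many start bleeding around 4.15 V per group, which lines up with the 97-98% figure):

```python
# Toy model of a top-balancing BMS. Thresholds are illustrative only.

BALANCE_START_V = 4.15   # no bleeding until the highest group is near full
BALANCE_DELTA_V = 0.01   # bleed groups more than 10 mV above the lowest

def groups_being_bled(group_volts):
    """Return indices of the series groups the BMS would bleed down."""
    if max(group_volts) < BALANCE_START_V:
        return []  # this is why an 80% charge never balances anything
    lowest = min(group_volts)
    return [i for i, v in enumerate(group_volts) if v - lowest > BALANCE_DELTA_V]

# 14S pack at ~80%: every group sits well under the threshold -> no balancing.
print(groups_being_bled([3.95] * 14))           # []
# Same pack held at 100%: the one high group gets bled back into line.
print(groups_being_bled([4.14] * 13 + [4.18]))  # [13]
```

The on/off flicker on the Satiator is essentially this loop repeating: bleed, top up, re-check, with the intervals stretching out as the groups converge.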
1 A is really slow and ultra-conservative, and 0.2 A is ridiculously so for these 18650 cells with at least 4 in each parallel cluster. That would equate to a 33 mA charge current per cell. There's no benefit to charging that slowly. Even charging at 1 A would only yield 167 mA per cell, which is still a trickle charge.
No. Less current means less heat into the pack, and low current is lower risk in general against anything unforeseen. Yes, I'm reducing a small risk to a really small risk. 0.2 A at, say, 55 V works out to 11 W of power through the wire. Say I have a pre-determined route and I know the bike is going to sit in a garage for 9 hours (the whole work day). If it charges that full 9 hours, it'll be at about 54.5 V when I get back to it, which is plenty to get me home, and charging at such a measured rate eliminates any concern about overcharging, since that low a rate makes it effectively impossible. So where is the problem? I'm taking advantage of my schedule and doing only what's necessary to do the job I need.
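Putting numbers on that 9-hour scenario (the 6-cells-per-parallel-group figure is implied by the 33 mA and 167 mA per-cell numbers quoted above; 25 Ah is the low end of the capacity range mentioned earlier):

```python
# Sanity check on the all-day 0.2 A office charge described above.

AMPS = 0.2
VOLTS = 55.0        # pack voltage figure used in the post
HOURS = 9.0         # the work day
PARALLEL = 6        # implied by the 33 mA / 167 mA per-cell numbers above
CAPACITY_AH = 25.0  # low end of the 25-30 Ah range

print(f"Power into the pack: {AMPS * VOLTS:.0f} W")          # 11 W
print(f"Per-cell current: {AMPS / PARALLEL * 1000:.0f} mA")  # 33 mA
added_ah = AMPS * HOURS
print(f"Charge added over the day: {added_ah:.1f} Ah "
      f"({added_ah / CAPACITY_AH:.0%} of the pack)")         # 1.8 Ah (7%)
```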
That pic of the power supply I posted above was in my office garage, and it was feeding in 0.30 A, as you can see. If I had to make a midday run to the bank, I'd kick it up to 0.50 A or maybe even a whole 1.0 A. There's no need for more because, again, I know my route and my power needs, and these rates meet them.
As you alluded to, it's mostly about heat. One could argue that it's better to let the pack cool to room temp and then charge at 4 A than to charge a warm pack at 2 A.
What do you have that's concrete to back this specific process up? I think this kind of approach only has merit if you aren't fully aware of your real-time pack temps, pack construction, and cell type in the first place, so you fudge your process in both directions (cooling and heating) in the hope of creating something that doesn't hurt too badly.
I mean... how hot do you think a battery pack gets? If it's so hot that you genuinely need to cool it off, then in my mind that means there was a failure (or a cost compromise) much earlier in the build process.