This talk about cable ratings got me curious, so I decided to do a few sums to see what the rate of temperature rise might be. I came up with this spreadsheet, which lets you calculate the voltage drop, the power dissipated and the rate of temperature rise. Thought it worth posting as the results surprised me a little (so let me know if you find any bugs in it) - the rate of temperature rise was lower than I expected.
Here is the result for 4m of 35mm2 at 500A:
[Attachment: TempRise.png - spreadsheet output]
So a voltage drop of 0.96V (0.24V/m) and a power loss of 480W (120W/m).
The bit that surprises me is the rate of temperature rise: only 1°C/sec. It's worth noting that the spreadsheet doesn't allow for cooling at all (either from the cable surface or into the end fittings), so it's only relevant for short time periods and longish lengths, but at this level it looks like bursts of 180kW would present no problem at all.
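For anyone who wants to check the sums without the spreadsheet, here's a minimal sketch of the same calculation. The copper resistivity, density and specific heat are my own assumed typical values, not the spreadsheet's, so the figures come out slightly different (0.98V rather than 0.96V, presumably down to the resistivity value used):

```python
# Adiabatic (no-cooling) temperature-rise estimate for a copper cable,
# matching the 4m / 35mm^2 / 500A case above.
# Material constants are assumed typical values for annealed copper.
RHO = 1.72e-8        # resistivity, ohm·m (~20C)
DENSITY = 8960       # kg/m^3
SPECIFIC_HEAT = 385  # J/(kg·K)

def cable_figures(length_m, area_mm2, current_a):
    area_m2 = area_mm2 * 1e-6
    resistance = RHO * length_m / area_m2           # ohms
    v_drop = current_a * resistance                 # volts
    power = current_a * v_drop                      # watts (I^2·R)
    mass = DENSITY * area_m2 * length_m             # kg of copper
    rate = power / (mass * SPECIFIC_HEAT)           # K/s, no cooling
    return v_drop, power, rate

v, p, r = cable_figures(4, 35, 500)
print(f"{v:.2f} V drop, {p:.0f} W, {r:.2f} C/s")
```

This prints roughly 0.98V, 491W and 1.0°C/s - the same ballpark as the spreadsheet, and it confirms the surprisingly gentle rate of rise.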
Just as a sanity check I compared it to this data sheet
https://hilltop-products.co.uk/amfile/f ... uct/30146/ which indicates that 35mm² cable can handle 2000A for 3 sec at 140°C (the shortest duration should give the best agreement, since the spreadsheet doesn't include the effect of cooling). The spreadsheet gave a temperature rise of 16°C/sec, so 3 × 16 = 48°C on top of 140°C gives 188°C, which is pretty close to the 180°C maximum operating temperature for that cable, so it seems to add up.
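The scaling behind that check is just I² on the dissipation: resistance, mass and heat capacity are fixed, so the rate at 2000A is (2000/500)² = 16 times the 500A rate. A quick sketch, again using my assumed copper constants as a stand-in for the spreadsheet:

```python
# Sanity check against the datasheet's 2000A / 3 sec rating for 35mm²,
# using assumed typical copper constants (not the spreadsheet's values).
RHO = 1.72e-8        # resistivity, ohm·m
DENSITY = 8960       # kg/m^3
SPECIFIC_HEAT = 385  # J/(kg·K)

area_m2 = 35e-6
current = 2000.0

# Per-metre figures: the cable length cancels out of the rate,
# which is why only the cross-section matters here.
power_per_m = current**2 * RHO / area_m2           # W/m
mass_per_m = DENSITY * area_m2                     # kg/m
rate = power_per_m / (mass_per_m * SPECIFIC_HEAT)  # C/s, no cooling

final_temp = 140 + rate * 3   # start at 140C, 3 sec burst
print(f"{rate:.1f} C/s -> {final_temp:.0f} C after 3 s")
```

This comes out at about 16.3°C/s and ~189°C after 3 seconds, right in line with the 188°C worked out above and the cable's 180°C limit.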