AC is supplied to the car at the usual domestic mains voltage, 240V give or take a bit. The cables are often 10mm²: 3 cores for single phase, 5 cores for 3-phase, plus a smaller comms cable so the car can tell the unit to stop sending electricity.
DC chargers are a bit more complex. Most of the ones around at the moment supply around 400V DC, and the limiting factor is often the car, which may demand no more than 125 amps, for example (it's usual to find that's the limiting factor on a 50kW rapid charger with cars that charge at up to 50kW). The voltage also varies with the battery's state of charge. Cables on a 50kW rapid charger are already quite thick.
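That 125 amp figure falls straight out of I = P / V. A quick back-of-envelope sketch (ignoring charging losses and the fact that pack voltage moves with state of charge):

```python
# Rough check: current drawn for a given charging power and pack voltage.
# Ignores converter losses and voltage sag/rise during the charge.
def current_amps(power_kw, voltage):
    return power_kw * 1000 / voltage

print(current_amps(50, 400))  # 125.0 A, the limit mentioned above
```

So a "50kW" charger feeding a 400V pack is really a 125A cable problem, which is why the leads are already chunky at that power level.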
When you get onto a 150kW rapid charger the cables get thicker still and start to become a problem, so they usually have cooling built into the cable.
Then you find the 350kW chargers, notice the cables are no larger than the 150kW ones, and wonder what's going on. The answer is that rather than being supplied at around 400V DC, the only cars that get over 175kW are the 800V DC ones, such as the Porsche Taycan. These DC chargers rely on the car to tell them how much current and voltage to supply throughout the charging process, which is why charging slows down when the battery gets warm and the car tells the charger to back off.
So you're right, it's simple physics: increase the voltage to keep the cable size sensible.
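The same I = P / V sum shows why doubling the pack voltage is the trick. Comparing the chargers mentioned above (round numbers, losses ignored):

```python
# Why 800 V keeps the cable cross-section sensible: current, not power,
# is what sizes the copper.
def current_amps(power_kw, voltage):
    return power_kw * 1000 / voltage

for power_kw, voltage in [(50, 400), (150, 400), (350, 800)]:
    print(f"{power_kw} kW at {voltage} V -> {current_amps(power_kw, voltage):.0f} A")
```

A 350kW charge at 800V is only about 438A, not much more than the 375A a 150kW/400V charge needs, so the cable barely has to grow. At 400V the same 350kW would mean 875A, which no hand-held cable is going to manage.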