I'm new to the forum and to CS, and this is my first post / first question, so here goes:
I'm modelling a simple heated pipe (40 °C uniform wall temperature) with water entering the inlet at 20 °C and 0.078 kg/s. The pipe is 0.01 m diameter × 0.5 m long. I'm getting sensible outlet temperatures of about 25 °C, which I've verified by hand calculation using the Gnielinski correlation: that gives a theoretical wall heat flux density of about 47,000 W/m² and a total heat input of 750 W, which seem reasonable.
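For reference, here's a minimal sketch of the hand calculation as I set it up. The water properties are assumed textbook values at roughly 20–25 °C (not measured), and the outlet temperature uses the standard constant-wall-temperature energy balance, so treat the numbers as approximate:

```python
import math

# Assumed water properties near 20-25 C (SI units), not measured values
rho, mu, cp, k = 998.0, 1.0e-3, 4182.0, 0.6
Pr = mu * cp / k                     # Prandtl number

D, L = 0.01, 0.5                     # pipe diameter and length [m]
mdot = 0.078                         # mass flow rate [kg/s]
T_in, T_wall = 20.0, 40.0            # inlet and wall temperatures [degC]

# Reynolds number from mass flow: Re = 4*mdot / (pi*D*mu)
Re = 4.0 * mdot / (math.pi * D * mu)

# Gnielinski correlation (valid roughly for 3000 < Re < 5e6)
f = (0.79 * math.log(Re) - 1.64) ** -2          # Petukhov friction factor
Nu = ((f / 8) * (Re - 1000) * Pr
      / (1 + 12.7 * math.sqrt(f / 8) * (Pr ** (2 / 3) - 1)))
h = Nu * k / D                                  # heat transfer coeff [W/m^2 K]

A = math.pi * D * L                             # wetted wall area [m^2]
# Constant-wall-temperature outlet temperature (exponential approach)
T_out = T_wall - (T_wall - T_in) * math.exp(-h * A / (mdot * cp))
Q = mdot * cp * (T_out - T_in)                  # total heat input [W]
q_wall = Q / A                                  # mean wall flux [W/m^2]

print(f"Re = {Re:.0f}, Nu = {Nu:.1f}, h = {h:.0f} W/m2K")
print(f"T_out = {T_out:.1f} C, Q = {Q:.0f} W, q_wall = {q_wall:.0f} W/m2")
```

With these assumed properties the flow comes out turbulent (Re on the order of 10^4) and the outlet temperature lands in the same ballpark as the simulation.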
However, a colleague of mine has written a script to extract the average heat flux density from the .txt output files, and it reports about 4×10⁶ W/m². The trend looks right, but that's roughly two orders of magnitude higher than the hand calculation. We've checked the script and all seems well, and we've double-checked that everything is in SI units.
On top of that, ParaViS shows extremely high heat flux at the inlet and outlet, on the order of 10⁸ W/m², and Gmsh shows flux along the wall ranging from 81,000 to 189,000 W/m².
What am I missing here?

Thanks in advance!
Please ask if you'd like to see the case files; I'm not sure of the best practice for attaching them.