Run-time postprocessing
For pump performance calculation (and in many other cases as well), one usually needs data about pressure rise (head), flow rate, and power. If cavitation is a concern, maximum and minimum pressure values come in handy, and maximum and minimum velocity can point to convergence or numerical issues. All of these values are best calculated at run time, so the settings belong in the functions sub-dictionary of controlDict.
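As a quick orientation, function object definitions such as the ones in the listing below all live inside the functions block of system/controlDict. A minimal skeleton (the bodies are elided here):

```
// system/controlDict
functions
{
    // function object definitions go here, e.g.:
    forces_impeller { /* ... */ }
    delta_p         { /* ... */ }
    Q               { /* ... */ }
    maxValues       { /* ... */ }
}
```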
// impeller forces for torque/power/viscous losses/axial and radial forces/mechanical losses
forces_impeller
{
    type            forces;
    libs            ("libforces.so");
    patches         (impeller_walls);
    rho             rhoInf;
    rhoInf          985;    // depends on temperature; also check viscosity
                            // in constant/transportProperties
    CofR            (0 0 0);
}

// pressure difference between inlet and outlet for head calculation
delta_p
{
    type            fieldValueDelta;
    libs            ("libfieldFunctionObjects.so");
    operation       subtract;

    region1
    {
        type            surfaceFieldValue;
        operation       average;
        fields          (p);
        writeFields     no;
        regionType      patch;
        name            outlet;
    }
    region2
    {
        type            surfaceFieldValue;
        operation       average;
        fields          (p);
        writeFields     no;
        regionType      patch;
        name            inlet;
    }
}

// flow (it is prescribed in 0/U but recalculated here as an additional check)
Q
{
    type            surfaceFieldValue;
    libs            ("libfieldFunctionObjects.so");
    writeFields     false;
    surfaceFormat   foam;
    regionType      patch;
    name            outlet;
    operation       areaIntegrate;
    fields          (U);
}

// extreme values: maximum velocity should be around the impeller
// circumferential velocity; the difference between minimum pressure
// and inlet pressure relates to NPSH
maxValues
{
    type            fieldMinMax;
    libs            ("libfieldFunctionObjects.so");
    writeToFile     no;
    log             yes;
    location        no;
    mode            magnitude;
    fields          (U p);
}
Values are calculated after every time step and the results are saved in the postProcessing/ directory.
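Once the averages are written out, turning them into pump quantities is a one-liner each. A minimal sketch (my own addition, not part of the case files): head from the pressure difference and hydraulic power from head and flow. Note that incompressible solvers like pimpleFoam work with kinematic pressure (p/rho, units m²/s²), so the density drops out of the head formula; the kinematic flag below covers both conventions.

```python
def head(delta_p, rho=985.0, g=9.81, kinematic=True):
    """Pump head [m] from the outlet-inlet pressure difference.

    In incompressible solvers like pimpleFoam, p is kinematic pressure
    (p/rho, units m^2/s^2), so rho cancels; for absolute pressure [Pa]
    pass kinematic=False.
    """
    return delta_p / g if kinematic else delta_p / (rho * g)

def hydraulic_power(H, Q, rho=985.0, g=9.81):
    """Hydraulic power [W] from head H [m] and flow rate Q [m^3/s]."""
    return rho * g * H * Q
```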
Post-postprocessing
There is also a handful of other quantities of interest, like yPlus for mesh quality checks and Q and vorticity for turbulence/loss visualisation. The problem with these quantities is that the whole field gets saved at each timestep. If your writeInterval in controlDict is not 1, OpenFOAM makes a mess of the timestep directories: every timestep gets a directory, some containing only the yPlus and Q fields and others all calculated quantities. This makes working with ParaView difficult, so it’s better to postprocess these fields separately:
mpirun -np 4 pimpleFoam -parallel -postProcess -latestTime -func yPlus
mpirun -np 4 pimpleFoam -parallel -postProcess -latestTime -func Q
mpirun -np 4 pimpleFoam -parallel -postProcess -latestTime -func vorticity
The postProcessing/ directory
If the simulation is interrupted for any reason and then restarted from latestTime (check your controlDict), new files are added to the postProcessing directory, each stamped with the restart time in its name. With many restarts the files multiply and make gathering the data difficult.
Gathering data
This is a script that reads the postprocessing data from all of the files mentioned above. It gathers the values into a single Python dictionary and pickles it to a file. That file can then be read in ParaView with a programmable filter/source for visualisation, or from another Python script for plotting and so on.
import os
import re
import pickle

# case times
times = []

# data to be written
data = {
    # [time]: {
    #     delta_p: [number],
    #     Q: [number],
    #     forces_impeller: [F_pressure, F_viscous, M_pressure, M_viscous]
    # }
}

def read_pp_data(pp_object, parse_function):
    for froot, _, fnames in os.walk(os.path.join('postProcessing', pp_object)):
        for fname in fnames:
            filename = os.path.join(froot, fname)
            if not os.path.isfile(filename):
                continue

            with open(filename, 'r') as f:
                print(filename)

                for line in f:
                    # skip comment lines
                    if line[0] == '#':
                        continue

                    # timestep
                    time = line.split()[0]

                    # add to an existing entry or create a new one
                    if time not in times:
                        times.append(time)
                        data[time] = {}

                    # add data to collection
                    data[time][pp_object] = parse_function(line)

# parsing files: each value has a different format
split_re = re.compile('[ \t\n\r()]+')  # split on whitespace and parentheses

def parse_delta_p(line):
    # delta_p function object
    values = line.split()
    return float(values[1])

def parse_Q(line):
    # surfaceFieldValue areaIntegrate (flow)
    return float(split_re.split(line)[2])

def parse_forces_impeller(line):
    # forces functionObject
    # forces format:
    # 0    1 2 3    4 5 6   7 8 9    10 11 12  13 14 15  16 17 18
    # Time forces(pressure viscous porous) moments(pressure viscous porous)
    forces = split_re.split(line)
    # interested in: pressure and viscous forces and moments
    indexes = (1, 2, 3, 4, 5, 6, 10, 11, 12, 13, 14, 15)
    return [float(forces[i]) for i in indexes]

read_pp_data('delta_p', parse_delta_p)
read_pp_data('Q', parse_Q)
read_pp_data('forces_impeller', parse_forces_impeller)

# change the text keys to floats
for k in list(data.keys()):
    data[float(k)] = data.pop(k)

# save the data to be read in ParaView
with open('postProcessing.pickle', 'wb') as f:
    pickle.dump(data, f, pickle.HIGHEST_PROTOCOL)
Plotting collected data
Loading a pickled dictionary is a piece of cake. I usually cut the first few results away since the early timesteps usually haven’t converged and their values are unphysical.
import pickle

import matplotlib.pyplot as plt
import numpy as np
import scipy.stats

import collect_pp  # import the above script to avoid running it manually

# start only after transients decay
t_start = 0.001  # here the interesting data starts

# pickled data format:
# 0.000444: {
#     'Q': -0.000222930392,
#     'delta_p': 200.0,
#     'forces_impeller': [-70.6219451, -0.307142376, 0.116242584,
#         0.0130158996, -0.0167682449, 0.00424048528, -0.320480923,
#         -0.00105645189, -0.00185268198, -0.0417415834, -4.51203576e-05,
#         -0.00011419315]
# },
with open('postProcessing.pickle', 'rb') as f:
    data = pickle.load(f)

# times
times = sorted(data.keys())
n_start = np.searchsorted(np.array(times), t_start)
times = times[n_start:]

p_times = []  # timesteps with complete data
p_H = []   # head
p_Q = []   # flow
p_Mp = []  # pressure torque
p_Mv = []  # viscous torque
p_Fx = []  # axial force

for t in times:
    # read everything first so a timestep with incomplete data
    # is skipped without leaving the lists out of sync
    try:
        H = data[t]['delta_p']
        Q = -data[t]['Q']
        forces = data[t]['forces_impeller']
    except KeyError:
        continue

    p_H.append(H)
    p_Q.append(Q)
    p_Mp.append(-forces[6])
    p_Mv.append(-forces[9])
    p_Fx.append(forces[0])
    # time
    p_times.append(t)
Filtering stuff
Occasionally a timestep doesn’t converge, so the calculated forces/moments/pressures come out way too high or too low. That ruins the day for graphing, and these values should be filtered out. Actually, it would be better to find out why the calculation doesn’t converge in the first place, but I haven’t had the time for a proper ‘debugging’ session yet. So I use a little trim utility from scipy:
# filter out the outliers: weird unexplained results that occur randomly.
# first, determine which indexes they are at.
def filter_data(values):
    # use a trimmed mean for the average or it doesn't 'mean' anything
    avg = scipy.stats.trim_mean(values, 0.1)
    return np.where(np.abs(values - avg) > abs(0.5*avg))[0]

i_filter = np.concatenate((
    filter_data(p_H),
    filter_data(p_Q),
    filter_data(p_Mp),
    filter_data(p_Mv),
    filter_data(p_Fx)))
i_filter = list(set(i_filter))

# delete all data with the given indexes
p_times = np.delete(p_times, i_filter)
p_H = np.delete(p_H, i_filter)
p_Q = np.delete(p_Q, i_filter)
p_Mp = np.delete(p_Mp, i_filter)
p_Mv = np.delete(p_Mv, i_filter)
p_Fx = np.delete(p_Fx, i_filter)
Now you’re ready to do whatever you like with the data. Like plotting:
# plot results
fig, ax1 = plt.subplots()

ax1.plot(p_times, p_H, 'b-')
ax1.set_xlabel('Time (s)')
ax1.set_ylabel('H [dm]', color='b')
ax1.tick_params('y', colors='b')

ax2 = ax1.twinx()
ax2.plot(p_times, p_Mp, 'r.')
ax2.set_ylabel('M [Nm]', color='r')
ax2.tick_params('y', colors='r')

plt.show()
Enjoy!
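To wrap up, here is a sketch (my own addition, not part of the scripts above) of condensing the filtered time series into single operating-point numbers, including an efficiency estimate. The rotational speed omega is an assumption here (take it from your MRF or dynamic mesh setup), and p_H is assumed to already be head in metres:

```python
import numpy as np

def performance(p_H, p_Q, p_Mp, p_Mv, omega, rho=985.0, g=9.81):
    """Average filtered series into one operating point.

    Assumes p_H is head [m], p_Q flow rate [m^3/s], torques in [Nm]
    and omega the impeller speed [rad/s] (an assumption in this sketch).
    Returns (head, flow, torque, efficiency).
    """
    H = float(np.mean(p_H))
    Q = float(np.mean(p_Q))
    M = float(np.mean(p_Mp) + np.mean(p_Mv))  # total shaft torque
    P_shaft = omega * M        # shaft power [W]
    P_hyd = rho * g * H * Q    # hydraulic power [W]
    return H, Q, M, P_hyd / P_shaft
```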