dedalus.core.evaluator

Classes for centralized evaluation of expression trees and handling of the results.

Module Contents

FILEHANDLER_MODE_DEFAULT
FILEHANDLER_PARALLEL_DEFAULT
FILEHANDLER_TOUCH_TMPFILE
logger
class Evaluator(dist, vars)

Coordinates evaluation of operator trees through various handlers.

Parameters:
  • dist (Distributor object) – Problem distributor

  • vars (dict) – Variables for parsing task expression strings

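In typical scripts the evaluator is built by the solver and accessed as solver.evaluator; direct construction follows the documented signature. A minimal sketch, assuming a Distributor dist and a field u defined elsewhere:

    from dedalus.core.evaluator import Evaluator

    # The solver normally builds this object and exposes it as solver.evaluator;
    # direct construction is shown here only to illustrate the signature.
    evaluator = Evaluator(dist, {'u': u})            # dist: Distributor, vars: name -> field
    handler = evaluator.add_dictionary_handler(iter=1)
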
add_dictionary_handler(**kw)

Create a dictionary handler and add to evaluator.

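A minimal sketch for collecting in-memory diagnostics, assuming a built solver named solver and a field u; retrieving outputs through the handler's fields dictionary is an assumption about DictionaryHandler (documented below) rather than part of this method's signature:

    # Evaluate the handler's tasks every 10 iterations and keep results in memory.
    diagnostics = solver.evaluator.add_dictionary_handler(iter=10)
    diagnostics.add_task(u*u, name='u_squared')
    # After evaluation, the output field can be retrieved by task name, e.g.
    #     diagnostics.fields['u_squared']
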
add_system_handler(**kw)

Create a system handler and add to evaluator.

add_file_handler(filename, parallel=None, **kw)

Create a file handler and add to evaluator.

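A minimal usage sketch, assuming a built IVP solver named solver and a field u; extra keywords are forwarded to the file handler and the handler's evaluation schedule:

    # Write outputs to the 'snapshots' folder every 0.25 simulation time units,
    # starting a new file after 50 writes.
    snapshots = solver.evaluator.add_file_handler('snapshots', sim_dt=0.25, max_writes=50)
    snapshots.add_task(u, name='u')
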
add_handler(handler)

Add a handler to evaluator.

evaluate_group(group, **kw)

Evaluate all handlers in a group.

evaluate_scheduled(**kw)

Evaluate all scheduled handlers.

evaluate_handlers(handlers, id=None, **kw)

Evaluate a collection of handlers.

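A minimal sketch, assuming an IVP solver named solver, a previously created handler snapshots, and a timestep dt; the keywords mirror the process() signatures of the HDF5 handlers documented below:

    # Force evaluation outside the normal schedule, e.g. to write a final state
    # after the main loop finishes.
    solver.evaluator.evaluate_handlers(
        [snapshots],
        iteration=solver.iteration,
        wall_time=0,
        sim_time=solver.sim_time,
        timestep=dt)
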
require_coeff_space(fields)

Move all fields to coefficient layout.

require_grid_space(fields)

Move all fields to grid layout.

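A minimal sketch, assuming fields u and v and access through solver.evaluator; these calls transform all listed fields to the requested layout before their data are used directly:

    solver.evaluator.require_grid_space([u, v])    # all listed fields now in grid layout
    solver.evaluator.require_coeff_space([u, v])   # all listed fields now in coefficient layout
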
static get_fields(tasks)

Get field set for a collection of tasks.

static attempt_tasks(tasks, **kw)

Attempt tasks and return the unfinished ones.

class Handler(dist, vars, group=None, wall_dt=None, sim_dt=None, iter=None, custom_schedule=None)

Group of tasks with associated evaluation schedule.

Parameters:
  • dist (Distributor object) – Problem distributor

  • vars (dict) – Variables for parsing task expression strings

  • group (str, optional) – Group name for forcing selected handlers (default: None).

  • wall_dt (float, optional) – Wall time cadence for evaluating tasks (default: None).

  • sim_dt (float, optional) – Simulation time cadence for evaluating tasks (default: None).

  • iter (int, optional) – Iteration cadence for evaluating tasks (default: None).

  • custom_schedule (function, optional) – Custom scheduling function returning a boolean for triggering output (default: None). Signature for IVPs: custom_schedule(iteration, wall_time, sim_time, timestep). Signature for BVPs: custom_schedule(iteration). A sketch follows this parameter list.

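A minimal sketch of an IVP custom schedule, assuming a built solver named solver and a field u; the argument names follow the documented signature, and the trigger condition itself is arbitrary:

    def every_hundredth_iteration(iteration, wall_time, sim_time, timestep):
        # Trigger output on every hundredth iteration.
        return iteration % 100 == 0

    special = solver.evaluator.add_file_handler('special_output',
                                                custom_schedule=every_hundredth_iteration)
    special.add_task(u, name='u')
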
check_schedule(**kw)

add_task(task, layout='g', name=None, scales=None)

Add task to handler.

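A minimal sketch, assuming a previously created handler snapshots and a field u; tasks may be fields or operator expressions, and the layout and scales keywords follow the signature above:

    snapshots.add_task(u, name='u', layout='g')     # raw field, output in grid space
    snapshots.add_task(u*u, name='u_squared')       # operator expressions are valid tasks
    snapshots.add_task(u, name='u_fine', scales=4)  # evaluate the output on a refined grid
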
add_tasks(tasks, **kw)

Add multiple tasks.

add_system(system, **kw)

Add fields from a FieldSystem.

class DictionaryHandler(*args, **kw)

Handler that stores outputs in a dictionary.

process(**kw)

Store references to the task output fields in the handler's dictionary.

class SystemHandler(dist, vars, group=None, wall_dt=None, sim_dt=None, iter=None, custom_schedule=None)

Handler that sets fields in a FieldSystem.

build_system()

Build FieldSystem and set task outputs.

process(**kw)

Gather fields into system.

class H5FileHandlerBase(base_path, *args, max_writes=None, mode=None, **kw)

Handler that writes tasks to an HDF5 file.

Parameters:
  • base_path (str) – Base path for analysis output folder

  • max_writes (int, optional) – Maximum number of writes per set. Default: None (infinite).

  • mode (str, optional) – ‘overwrite’ to delete any existing analysis output with the same base path; ‘append’ to begin with the set number incremented past any existing output. The default behavior is set by a config option. A usage sketch follows this parameter list.

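A minimal sketch of how these options are typically supplied, assuming a built solver named solver and a field u; they are passed through Evaluator.add_file_handler, which forwards extra keywords to the file handler:

    checkpoints = solver.evaluator.add_file_handler(
        'checkpoints',
        wall_dt=3600,         # write roughly once per hour of wall time
        max_writes=1,         # one write per set
        mode='overwrite')     # discard existing output at this base path
    checkpoints.add_task(u, name='u')
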
property current_path
property current_file
add_task(*args, **kw)

Add task to handler.

get_data_distribution(task, rank=None)

Determine write parameters for a task.

get_file(**kw)

Return current HDF5 file, creating if necessary.

create_current_file()

Generate and set up a new HDF5 file from the root process.

setup_file(file)

Prepare new HDF5 file for writing.

create_task_dataset(file, task)

Create dataset for a task.

process(iteration, wall_time=0, sim_time=0, timestep=0)

Save task outputs to HDF5 file.

write_file_metadata(file, **kw)

Write file metadata and time scales.

abstract write_task(file, task)

Write task data.

abstract open_file(mode='r+')

Open current HDF5 file for processing.

abstract close_file(file)

Close current HDF5 file after processing.

class H5GatherFileHandler(base_path, *args, max_writes=None, mode=None, **kw)

H5FileHandler that gathers global data to write from the root process.

open_file(mode='r+')

Open current HDF5 file for processing.

close_file(file)

Close current HDF5 file after processing.

write_file_metadata(file, **kw)

Write file metadata and time scales.

write_task(file, task)

Write task data.

class H5ParallelFileHandler(*args, **kw)

H5FileHandler using parallel HDF5 writes.

open_file(mode='r+')

Return current HDF5 file. Must already exist.

close_file(file)

Close current HDF5 file after processing.

create_current_file()

Generate and set up a new HDF5 file.

create_task_dataset(file, task)

Create dataset for a task.

write_task(file, task)

Write task data.

get_hdf5_spaces(task, index)

Create HDF5 space objects for writing the local portion of a task.

class H5VirtualFileHandler(base_path, *args, max_writes=None, mode=None, **kw)

H5FileHandler using process files and virtual joint files.

property current_process_file
empty()
open_file(mode='r+')

Open current HDF5 file for processing.

close_file(file)

Close current HDF5 file after processing.

create_current_file()

Generate and set up a new HDF5 file.

create_task_dataset(file, task)

Create dataset for a task.

setup_process_file(file)

Prepare new HDF5 file for writing.

write_file_metadata(file, **kw)

Write file metadata and time scales.

write_task(file, task)

Write task data.

static merge_task(file, task_name, overwrite=False)

Merge virtual dataset into regular dataset.
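
A minimal sketch, assuming the joint file was produced by an H5VirtualFileHandler; the file path is hypothetical:

    import h5py
    from dedalus.core.evaluator import H5VirtualFileHandler

    # Replace the virtual dataset for task 'u' with a regular, self-contained
    # dataset inside an existing joint file.
    with h5py.File('snapshots/snapshots_s1.h5', 'r+') as file:
        H5VirtualFileHandler.merge_task(file, 'u', overwrite=True)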