diff --git a/doc/source/_static/dpf_operators.html b/doc/source/_static/dpf_operators.html
index 405c9485e75..dd68b7b6bd7 100644
--- a/doc/source/_static/dpf_operators.html
+++ b/doc/source/_static/dpf_operators.html
@@ -8369,9 +8369,7 @@

Configuring operators

Support for already merged fields containers.

A vector of fields containers to merge or fields containers from pin 0 to ...

Weights to apply to each field from pin 1000 to ...

-

Outputs

Configurations

Scripting

Changelog

math: min/max over time

Description

Evaluates minimum/maximum over time/frequency.

-
Version 0.0.0

Inputs

Define min or max.

-

Outputs

Configurations

Scripting

Changelog

utility: merge fields containers

Description

Assembles a set of fields containers into a unique one.

+

Outputs

Configurations

Scripting

Changelog

utility: merge fields containers

Description

Assembles a set of fields containers into a unique one.

Version 0.0.0

Inputs

For some result files (such as RST), the scoping on names selection is duplicated through all the distributed files. If this pin is false, the merging process is skipped. If it is true, this scoping is merged. Default is true.

Default is false. If true, redundant quantities are summed instead of being ignored.

Support for already merged fields.
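The merge-with-summing option described above can be sketched in plain Python, with each field as a dict from entity id to value (an illustrative stand-in, not the DPF API):

```python
def merge_fields(fields, sum_redundant=False):
    """Merge several fields (entity id -> value dicts) into one.

    Assumed semantics of the pin documented above: redundant entity ids
    are summed when sum_redundant is True; otherwise the first occurrence
    wins and later ones are ignored.
    """
    merged = {}
    for field in fields:
        for entity_id, value in field.items():
            if entity_id in merged and sum_redundant:
                merged[entity_id] += value
            elif entity_id not in merged:
                merged[entity_id] = value
    return merged
```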

@@ -9306,7 +9304,7 @@

LaTeX

If connected, this pin allows you to extract the result only on the selected shell layer(s). The available values are: 0: Top, 1: Bottom, 2: TopBottom, 3: Mid, 4: TopBottomMid.
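The integer-to-layer mapping listed above can be captured in a small enum; the names below are illustrative, not the actual DPF identifiers:

```python
from enum import IntEnum

class ShellLayer(IntEnum):
    # Values taken from the pin documentation above.
    TOP = 0
    BOTTOM = 1
    TOP_BOTTOM = 2
    MID = 3
    TOP_BOTTOM_MID = 4
```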

Compute mid nodes (when available) by averaging the neighbour corner nodes. Default: True

Outputs

Configurations

Scripting

Changelog

result: nodal rotational acceleration

Description

Read/compute nodal rotational acceleration by calling the readers defined by the datasources.

-
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.
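The selection and clamping rule described in the paragraph above can be sketched as follows (assumed semantics, plain Python rather than the DPF reader; clamping below the minimum is added here for safety, though the documentation does not specify it):

```python
from bisect import bisect_left

def values_at_times(times, data, requested):
    """Sketch of the time/freq rule: exact sets are returned as stored,
    in-range misses are linearly interpolated between neighbours, and
    anything above the max available time/freq takes the data at the max."""
    out = []
    for t in requested:
        if t <= times[0]:
            out.append(data[0])           # clamp below the min (assumption)
        elif t >= times[-1]:
            out.append(data[-1])          # clamp above the max, per the doc
        else:
            i = bisect_left(times, t)
            if times[i] == t:
                out.append(data[i])       # exact time/freq set
            else:
                t0, t1 = times[i - 1], times[i]
                w = (t - t0) / (t1 - t0)
                out.append(data[i - 1] * (1 - w) + data[i] * w)
    return out
```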

+
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

nodes or elements scoping required in output. The output fields will be scoped on these node or element IDs. To figure out the ordering of the fields data, look at their scoping IDs, as they might not be ordered as the input scoping was. The scoping's location indicates whether nodes or elements are asked for. Using a scopings container allows you to split the result fields container into domains.

Already-allocated fields container, modified in place.

result file container allowed to be kept open to cache data

@@ -9954,7 +9952,7 @@

LaTeX

if true, the field is rotated to the global coordinate system (default: true). Please check your results carefully if 'false' is used for Elemental or ElementalNodal results averaged to the nodes when adjacent elements do not share the same coordinate system, as results may be incorrect.

prevents reading the mesh from the result files

Outputs

Configurations

Scripting

Changelog

result: velocity

Description

Read/compute nodal velocities by calling the readers defined by the datasources.

-
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

+
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

nodes or elements scoping required in output. The output fields will be scoped on these node or element IDs. To figure out the ordering of the fields data, look at their scoping IDs, as they might not be ordered as the input scoping was. The scoping's location indicates whether nodes or elements are asked for. Using a scopings container allows you to split the result fields container into domains.

Already-allocated fields container, modified in place.

result file container allowed to be kept open to cache data

@@ -9991,7 +9989,7 @@

LaTeX

result file path container, used if no streams are set

(LSDyna) Unit System ID (int), semicolon-separated list of base unit strings (str) or UnitSystem instance

Outputs

Configurations

Scripting

Changelog

result: acceleration

Description

Read/compute nodal accelerations by calling the readers defined by the datasources.

-
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

+
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

nodes or elements scoping required in output. The output fields will be scoped on these node or element IDs. To figure out the ordering of the fields data, look at their scoping IDs, as they might not be ordered as the input scoping was. The scoping's location indicates whether nodes or elements are asked for. Using a scopings container allows you to split the result fields container into domains.

Already-allocated fields container, modified in place.

result file container allowed to be kept open to cache data

@@ -11090,7 +11088,7 @@

LaTeX

component priority table (vector of int)

if true, uses scoping to sort the field (default is false)

Outputs

Configurations

Scripting

Changelog

result: nodal rotation

Description

Read/compute nodal rotation by calling the readers defined by the datasources.

-
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

+
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

nodes or elements scoping required in output. The output fields will be scoped on these node or element IDs. To figure out the ordering of the fields data, look at their scoping IDs, as they might not be ordered as the input scoping was. The scoping's location indicates whether nodes or elements are asked for. Using a scopings container allows you to split the result fields container into domains.

Already-allocated fields container, modified in place.

result file container allowed to be kept open to cache data

@@ -11157,7 +11155,7 @@

LaTeX

if 1, cyclic symmetry is ignored; if 2, cyclic expansion is done (default is 1).

region id (integer) or vector of region ids (vector) or region scoping (scoping) of the model (region corresponds to zone for Fluid results or part for LSDyna results).

Outputs

Configurations

Scripting

Changelog

result: nodal rotational velocity

Description

Read/compute nodal rotational velocity by calling the readers defined by the datasources.

-
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

+
Version 0.0.0

Supported file types

Inputs

time/freq values (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output. To specify time/freq values at specific load steps, put a Field (and not a list) in input with a scoping located on "TimeFreq_steps". Linear time/freq interpolation is performed if the values are not in the result files, and the data at the max time or freq is taken when time/freqs are higher than the available time/freqs in the result files. To get all data for all time/freq sets, connect an int with value -1.

nodes or elements scoping required in output. The output fields will be scoped on these node or element IDs. To figure out the ordering of the fields data, look at their scoping IDs, as they might not be ordered as the input scoping was. The scoping's location indicates whether nodes or elements are asked for. Using a scopings container allows you to split the result fields container into domains.

Already-allocated fields container, modified in place.

result file container allowed to be kept open to cache data

@@ -14743,7 +14741,45 @@

LaTeX

Outputs

Configurations

Scripting

Changelog

utility: merge collections

Description

Merges a set of collections into a unique one.

Version 0.0.0

Inputs

A vector of collections to merge, or collections from pin 0 to ...

Outputs

Configurations

Scripting

Changelog

min_max: min max by entity

Description

Computes the entity-wise minimum (out 0) and maximum (out 1) across all fields of a fields container.
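A minimal sketch of the entity-wise reduction described above, with each field as a dict from entity id to value (illustrative stand-in, not the DPF API):

```python
def min_max_by_entity(fields):
    """Return (min_field, max_field): per entity id, the minimum and the
    maximum value taken over all fields that contain that id."""
    ids = set().union(*fields)
    minimum = {i: min(f[i] for f in fields if i in f) for i in ids}
    maximum = {i: max(f[i] for f in fields if i in f) for i in ids}
    return minimum, maximum
```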

-
Version 0.0.0

Inputs

Outputs

Configurations

Scripting

Changelog

result: von mises stresses as mechanical

Description

Computes the equivalent (von Mises) stresses and averages them to the nodes (by default). For multibody simulations, averaging across bodies can either be activated or deactivated.

+
Version 0.0.0

Inputs

Outputs

Configurations

Scripting

Changelog

utility: concatenate fields

Description

Concatenates fields into a unique one by incrementing the number of components.

+

Example:

  • Field1 components: { UX, UY, UZ }
  • Field2 components: { RX, RY, RZ }
  • Output field: { UX, UY, UZ, RX, RY, RZ }
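The component stacking in the example above can be sketched in plain Python, with fields as entity-id-to-components dicts (illustrative only; the fill-value handling for differing scopings is omitted here):

```python
def concatenate_fields(*fields):
    """Stack the components of several fields sharing the same scoping:
    each output entity gets as many components as all inputs combined."""
    return {i: [c for f in fields for c in f[i]] for i in fields[0]}
```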
Version 0.1.0

Inputs

Value used to fill the missing values when scopings are different. Default is 0.

+

Pin of the field of which to take the scoping for the output field. If -1, all scopings will be merged; if -2, all scopings will be intersected. Default is -1.
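The -1/-2 convention for this pin can be sketched as follows (assumed semantics, plain Python rather than the DPF implementation):

```python
def output_scoping(scopings, reference_index):
    """-1: union (merge) of all scopings; -2: intersection of all scopings;
    any other value: the scoping of the field at that pin index."""
    if reference_index == -1:
        return sorted(set().union(*scopings))
    if reference_index == -2:
        return sorted(set(scopings[0]).intersection(*scopings[1:]))
    return sorted(scopings[reference_index])
```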

+

Support of the output field.

+

A vector of fields to merge from pin 0 to ...

+

Outputs

Field which has as many components as the sum of all the input fields' numbers of components.

+

Configurations

Scripting

Changelog

utility: concatenate fields containers

Description

Concatenates fields containers into a unique one by concatenating each of their fields.

+

Example:

+
  • Fields Container 1:
    • Field1 with components: { UX, UY, UZ }
    • Field2 with components: { VX, VY, VZ }
  • Fields Container 2:
    • Field1 with components: { RX, RY, RZ }
    • Field2 with components: { AX, AY, AZ }
  • Output Fields Container:
    • Field1 with components: { UX, UY, UZ, RX, RY, RZ }
    • Field2 with components: { VX, VY, VZ, AX, AY, AZ }
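A plain-Python sketch of the per-index rule illustrated above, with containers as lists of fields and fields as entity-id-to-components dicts (illustrative stand-in, not the DPF API):

```python
def concatenate_fields_containers(*containers):
    """Field k of the output concatenates the components of field k of
    every input container (fields are matched by index)."""
    def cat(*fields):
        return {i: [c for f in fields for c in f[i]] for i in fields[0]}
    return [cat(*fields) for fields in zip(*containers)]
```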
Version 0.1.0

Inputs

Value used to fill the missing values when scopings are different. Default is 0.

+

Pin of the field of which to take the scoping for the output field. If -1, all scopings will be merged; if -2, all scopings will be intersected. Default is -1.

+

Support of the output fields container's fields. By default each field has the support of the corresponding field of the first fields container.

+

A vector of fields containers to merge from pin 0 to ...

+

Outputs

Fields containers with fields which have as many components as the sum of all the input fields' numbers of components of the same index.

+

Configurations

Scripting

Changelog

result: von mises stresses as mechanical

Description

Computes the equivalent (von Mises) stresses and averages them to the nodes (by default). For multibody simulations, averaging across bodies can either be activated or deactivated.

Version 0.0.0

Inputs

time/freq (use doubles or field), time/freq set ids (use ints or scoping) or time/freq step ids (use scoping with TimeFreq_steps location) required in output.

nodes or elements scoping required in output.

result file container allowed to be kept open to cache data.

@@ -15501,7 +15537,9 @@

LaTeX

Average the Elemental Nodal result to the requested location.

Field or fields container containing only the displacement field (nodal). If none is specified, displacements are read from the result file using the data_sources.

Outputs

The computed result fields container (elemental nodal).

-

Configurations

Scripting

Changelog

result: compute stress 2

Description

Computes the stress from an elastic strain field. compute_total_strain limitations are applicable for stress computation. Gets the 2nd principal component.

+

Configurations

Scripting

Changelog

math: min/max over time

Description

Evaluates minimum/maximum over time/frequency.

+
Version 0.0.0

Inputs

Define min or max.

+

Outputs

Configurations

Scripting

Changelog

result: compute stress 2

Description

Computes the stress from an elastic strain field. compute_total_strain limitations are applicable for stress computation. Gets the 2nd principal component.

Version 0.0.0

Inputs

The element scoping on which the result is computed.

Needed to get mesh and material ids. Optional if a data_sources have been connected.

Needed to get mesh and material ids. Optional if a streams_container have been connected.

diff --git a/src/ansys/dpf/core/operators/utility/__init__.py b/src/ansys/dpf/core/operators/utility/__init__.py index fca46ab9b2c..77f008f850d 100644 --- a/src/ansys/dpf/core/operators/utility/__init__.py +++ b/src/ansys/dpf/core/operators/utility/__init__.py @@ -7,6 +7,8 @@ from .change_location import change_location from .change_shell_layers import change_shell_layers from .compute_time_scoping import compute_time_scoping +from .concatenate_fields import concatenate_fields +from .concatenate_fields_containers import concatenate_fields_containers from .customtypefield_get_attribute import customtypefield_get_attribute from .cyclic_support_get_attribute import cyclic_support_get_attribute from .default_value import default_value diff --git a/src/ansys/dpf/core/operators/utility/concatenate_fields.py b/src/ansys/dpf/core/operators/utility/concatenate_fields.py new file mode 100644 index 00000000000..1ff46f6513c --- /dev/null +++ b/src/ansys/dpf/core/operators/utility/concatenate_fields.py @@ -0,0 +1,390 @@ +""" +concatenate_fields + +Autogenerated DPF operator classes. +""" + +from __future__ import annotations +from typing import TYPE_CHECKING + +from warnings import warn +from ansys.dpf.core.dpf_operator import Operator +from ansys.dpf.core.inputs import Input, _Inputs +from ansys.dpf.core.outputs import Output, _Outputs +from ansys.dpf.core.operators.specification import PinSpecification, Specification +from ansys.dpf.core.config import Config +from ansys.dpf.core.server_types import AnyServerType + +if TYPE_CHECKING: + from ansys.dpf.core.field import Field + + +class concatenate_fields(Operator): + r"""Concatenates fields into a unique one by incrementing the number of + components. + + Example: - Field1 components: { UX, UY, UZ } - Field2 components: { RX, + RY, RZ } - Output field : { UX, UY, UZ, RX, RY, RZ } + + + Inputs + ------ + rescoping_value: float, optional + Value used to fill the missing values when scopings are different. Default is 0. 
+ reference_scoping_index: int, optional + Pin of the field of which to take the scoping for the output field. + If -1 all scopings will be merged, if -2 all scopings will be intersected. Default is -1 + field_support: AbstractFieldSupport, optional + Support of the output field. + fields1: + A vector of fields to merge from pin 0 to ... + fields2: + A vector of fields to merge from pin 0 to ... + + Outputs + ------- + merged_fields: Field + Field which has as many components as the sum of all the input fields' numbers of components. + + Examples + -------- + >>> from ansys.dpf import core as dpf + + >>> # Instantiate operator + >>> op = dpf.operators.utility.concatenate_fields() + + >>> # Make input connections + >>> my_rescoping_value = float() + >>> op.inputs.rescoping_value.connect(my_rescoping_value) + >>> my_reference_scoping_index = int() + >>> op.inputs.reference_scoping_index.connect(my_reference_scoping_index) + >>> my_field_support = dpf.AbstractFieldSupport() + >>> op.inputs.field_support.connect(my_field_support) + >>> my_fields1 = dpf.() + >>> op.inputs.fields1.connect(my_fields1) + >>> my_fields2 = dpf.() + >>> op.inputs.fields2.connect(my_fields2) + + >>> # Instantiate operator and connect inputs in one line + >>> op = dpf.operators.utility.concatenate_fields( + ... rescoping_value=my_rescoping_value, + ... reference_scoping_index=my_reference_scoping_index, + ... field_support=my_field_support, + ... fields1=my_fields1, + ... fields2=my_fields2, + ... 
) + + >>> # Get output data + >>> result_merged_fields = op.outputs.merged_fields() + """ + + def __init__( + self, + rescoping_value=None, + reference_scoping_index=None, + field_support=None, + fields1=None, + fields2=None, + config=None, + server=None, + ): + super().__init__( + name="merge::concatenate_fields", + config=config, + server=server, + inputs_type=InputsConcatenateFields, + outputs_type=OutputsConcatenateFields, + ) + if rescoping_value is not None: + self.inputs.rescoping_value.connect(rescoping_value) + if reference_scoping_index is not None: + self.inputs.reference_scoping_index.connect(reference_scoping_index) + if field_support is not None: + self.inputs.field_support.connect(field_support) + if fields1 is not None: + self.inputs.fields1.connect(fields1) + if fields2 is not None: + self.inputs.fields2.connect(fields2) + + @staticmethod + def _spec() -> Specification: + description = r"""Concatenates fields into a unique one by incrementing the number of +components. + +Example: - Field1 components: { UX, UY, UZ } - Field2 components: { RX, +RY, RZ } - Output field : { UX, UY, UZ, RX, RY, RZ } +""" + spec = Specification( + description=description, + map_input_pin_spec={ + -3: PinSpecification( + name="rescoping_value", + type_names=["double"], + optional=True, + document=r"""Value used to fill the missing values when scopings are different. Default is 0.""", + ), + -2: PinSpecification( + name="reference_scoping_index", + type_names=["int32"], + optional=True, + document=r"""Pin of the field of which to take the scoping for the output field. +If -1 all scopings will be merged, if -2 all scopings will be intersected. 
Default is -1""", + ), + -1: PinSpecification( + name="field_support", + type_names=["abstract_field_support"], + optional=True, + document=r"""Support of the output field.""", + ), + 0: PinSpecification( + name="fields", + type_names=["any"], + optional=False, + document=r"""A vector of fields to merge from pin 0 to ...""", + ), + 1: PinSpecification( + name="fields", + type_names=["any"], + optional=False, + document=r"""A vector of fields to merge from pin 0 to ...""", + ), + }, + map_output_pin_spec={ + 0: PinSpecification( + name="merged_fields", + type_names=["field"], + optional=False, + document=r"""Field which has as many components as the sum of all the input fields' numbers of components.""", + ), + }, + ) + return spec + + @staticmethod + def default_config(server: AnyServerType = None) -> Config: + """Returns the default config of the operator. + + This config can then be changed to the user needs and be used to + instantiate the operator. The Configuration allows to customize + how the operation will be processed by the operator. + + Parameters + ---------- + server: + Server with channel connected to the remote or local instance. When + ``None``, attempts to use the global server. + + Returns + ------- + config: + A new Config instance equivalent to the default config for this operator. + """ + return Operator.default_config(name="merge::concatenate_fields", server=server) + + @property + def inputs(self) -> InputsConcatenateFields: + """Enables to connect inputs to the operator + + Returns + -------- + inputs: + An instance of InputsConcatenateFields. + """ + return self._inputs + + @property + def outputs(self) -> OutputsConcatenateFields: + """Enables to get outputs of the operator by evaluating it + + Returns + -------- + outputs: + An instance of OutputsConcatenateFields. + """ + return self._outputs + + +class InputsConcatenateFields(_Inputs): + """Intermediate class used to connect user inputs to + concatenate_fields operator. 
+ + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields() + >>> my_rescoping_value = float() + >>> op.inputs.rescoping_value.connect(my_rescoping_value) + >>> my_reference_scoping_index = int() + >>> op.inputs.reference_scoping_index.connect(my_reference_scoping_index) + >>> my_field_support = dpf.AbstractFieldSupport() + >>> op.inputs.field_support.connect(my_field_support) + >>> my_fields1 = dpf.() + >>> op.inputs.fields1.connect(my_fields1) + >>> my_fields2 = dpf.() + >>> op.inputs.fields2.connect(my_fields2) + """ + + def __init__(self, op: Operator): + super().__init__(concatenate_fields._spec().inputs, op) + self._rescoping_value: Input[float] = Input( + concatenate_fields._spec().input_pin(-3), -3, op, -1 + ) + self._inputs.append(self._rescoping_value) + self._reference_scoping_index: Input[int] = Input( + concatenate_fields._spec().input_pin(-2), -2, op, -1 + ) + self._inputs.append(self._reference_scoping_index) + self._field_support: Input = Input( + concatenate_fields._spec().input_pin(-1), -1, op, -1 + ) + self._inputs.append(self._field_support) + self._fields1: Input = Input(concatenate_fields._spec().input_pin(0), 0, op, 0) + self._inputs.append(self._fields1) + self._fields2: Input = Input(concatenate_fields._spec().input_pin(1), 1, op, 1) + self._inputs.append(self._fields2) + + @property + def rescoping_value(self) -> Input[float]: + r"""Allows to connect rescoping_value input to the operator. + + Value used to fill the missing values when scopings are different. Default is 0. + + Returns + ------- + input: + An Input instance for this pin. 
+ + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields() + >>> op.inputs.rescoping_value.connect(my_rescoping_value) + >>> # or + >>> op.inputs.rescoping_value(my_rescoping_value) + """ + return self._rescoping_value + + @property + def reference_scoping_index(self) -> Input[int]: + r"""Allows to connect reference_scoping_index input to the operator. + + Pin of the field of which to take the scoping for the output field. + If -1 all scopings will be merged, if -2 all scopings will be intersected. Default is -1 + + Returns + ------- + input: + An Input instance for this pin. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields() + >>> op.inputs.reference_scoping_index.connect(my_reference_scoping_index) + >>> # or + >>> op.inputs.reference_scoping_index(my_reference_scoping_index) + """ + return self._reference_scoping_index + + @property + def field_support(self) -> Input: + r"""Allows to connect field_support input to the operator. + + Support of the output field. + + Returns + ------- + input: + An Input instance for this pin. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields() + >>> op.inputs.field_support.connect(my_field_support) + >>> # or + >>> op.inputs.field_support(my_field_support) + """ + return self._field_support + + @property + def fields1(self) -> Input: + r"""Allows to connect fields1 input to the operator. + + A vector of fields to merge from pin 0 to ... + + Returns + ------- + input: + An Input instance for this pin. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields() + >>> op.inputs.fields1.connect(my_fields1) + >>> # or + >>> op.inputs.fields1(my_fields1) + """ + return self._fields1 + + @property + def fields2(self) -> Input: + r"""Allows to connect fields2 input to the operator. 
+ + A vector of fields to merge from pin 0 to ... + + Returns + ------- + input: + An Input instance for this pin. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields() + >>> op.inputs.fields2.connect(my_fields2) + >>> # or + >>> op.inputs.fields2(my_fields2) + """ + return self._fields2 + + +class OutputsConcatenateFields(_Outputs): + """Intermediate class used to get outputs from + concatenate_fields operator. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields() + >>> # Connect inputs : op.inputs. ... + >>> result_merged_fields = op.outputs.merged_fields() + """ + + def __init__(self, op: Operator): + super().__init__(concatenate_fields._spec().outputs, op) + self._merged_fields: Output[Field] = Output( + concatenate_fields._spec().output_pin(0), 0, op + ) + self._outputs.append(self._merged_fields) + + @property + def merged_fields(self) -> Output[Field]: + r"""Allows to get merged_fields output of the operator + + Field which has as many components as the sum of all the input fields' numbers of components. + + Returns + ------- + output: + An Output instance for this pin. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields() + >>> # Get the output from op.outputs. ... + >>> result_merged_fields = op.outputs.merged_fields() + """ + return self._merged_fields diff --git a/src/ansys/dpf/core/operators/utility/concatenate_fields_containers.py b/src/ansys/dpf/core/operators/utility/concatenate_fields_containers.py new file mode 100644 index 00000000000..5de27141239 --- /dev/null +++ b/src/ansys/dpf/core/operators/utility/concatenate_fields_containers.py @@ -0,0 +1,402 @@ +""" +concatenate_fields_containers + +Autogenerated DPF operator classes. 
+""" + +from __future__ import annotations +from typing import TYPE_CHECKING + +from warnings import warn +from ansys.dpf.core.dpf_operator import Operator +from ansys.dpf.core.inputs import Input, _Inputs +from ansys.dpf.core.outputs import Output, _Outputs +from ansys.dpf.core.operators.specification import PinSpecification, Specification +from ansys.dpf.core.config import Config +from ansys.dpf.core.server_types import AnyServerType + +if TYPE_CHECKING: + from ansys.dpf.core.fields_container import FieldsContainer + + +class concatenate_fields_containers(Operator): + r"""Concatenates fields containers into a unique one by concatenating each + of their fields. + + Example: - Fields Container 1: - Field1 with components: { UX, UY, UZ } + - Field2 with components: { VX, VY, VZ } - Fields Container 2: - Field1 + with components: { RX, RY, RZ } - Field2 with components: { AX, AY, AZ } + - Output Fields Container: - Field1 with components: { UX, UY, UZ, RX, + RY, RZ } - Field2 with components: { VX, VY, VZ, AX, AY, AZ } + + + Inputs + ------ + rescoping_value: float, optional + Value used to fill the missing values when scopings are different. Default is 0. + reference_scoping_index: int, optional + Pin of the field of which to take the scoping for the output field. + If -1 all scopings will be merged, if -2 all scopings will be intersected. Default is -1. + field_support: AbstractFieldSupport, optional + Support of the output fields container's fields. By default each field has the support of the corresponding field of the first fields container. + fields_containers1: + A vector of fields containers to merge from pin 0 to ... + fields_containers2: + A vector of fields containers to merge from pin 0 to ... + + Outputs + ------- + merged_collections: FieldsContainer + Fields containers with fields which have as many components as the sum of all the input fields' numbers of components of the same index. 
+
+    Examples
+    --------
+    >>> from ansys.dpf import core as dpf
+
+    >>> # Instantiate operator
+    >>> op = dpf.operators.utility.concatenate_fields_containers()
+
+    >>> # Make input connections
+    >>> my_rescoping_value = float()
+    >>> op.inputs.rescoping_value.connect(my_rescoping_value)
+    >>> my_reference_scoping_index = int()
+    >>> op.inputs.reference_scoping_index.connect(my_reference_scoping_index)
+    >>> my_field_support = dpf.AbstractFieldSupport()
+    >>> op.inputs.field_support.connect(my_field_support)
+    >>> my_fields_containers1 = dpf.FieldsContainer()
+    >>> op.inputs.fields_containers1.connect(my_fields_containers1)
+    >>> my_fields_containers2 = dpf.FieldsContainer()
+    >>> op.inputs.fields_containers2.connect(my_fields_containers2)
+
+    >>> # Instantiate operator and connect inputs in one line
+    >>> op = dpf.operators.utility.concatenate_fields_containers(
+    ...     rescoping_value=my_rescoping_value,
+    ...     reference_scoping_index=my_reference_scoping_index,
+    ...     field_support=my_field_support,
+    ...     fields_containers1=my_fields_containers1,
+    ...     fields_containers2=my_fields_containers2,
+    ... )
+
+    >>> # Get output data
+    >>> result_merged_collections = op.outputs.merged_collections()
+    """
+
+    def __init__(
+        self,
+        rescoping_value=None,
+        reference_scoping_index=None,
+        field_support=None,
+        fields_containers1=None,
+        fields_containers2=None,
+        config=None,
+        server=None,
+    ):
+        super().__init__(
+            name="merge::concatenate_fields_containers",
+            config=config,
+            server=server,
+            inputs_type=InputsConcatenateFieldsContainers,
+            outputs_type=OutputsConcatenateFieldsContainers,
+        )
+        if rescoping_value is not None:
+            self.inputs.rescoping_value.connect(rescoping_value)
+        if reference_scoping_index is not None:
+            self.inputs.reference_scoping_index.connect(reference_scoping_index)
+        if field_support is not None:
+            self.inputs.field_support.connect(field_support)
+        if fields_containers1 is not None:
+            self.inputs.fields_containers1.connect(fields_containers1)
+        if fields_containers2 is not None:
+            self.inputs.fields_containers2.connect(fields_containers2)
+
+    @staticmethod
+    def _spec() -> Specification:
+        description = r"""Concatenates fields containers into a single one by concatenating each
+of their fields.
+
+Example:
+
+- Fields Container 1:
+  - Field1 with components: { UX, UY, UZ }
+  - Field2 with components: { VX, VY, VZ }
+- Fields Container 2:
+  - Field1 with components: { RX, RY, RZ }
+  - Field2 with components: { AX, AY, AZ }
+- Output Fields Container:
+  - Field1 with components: { UX, UY, UZ, RX, RY, RZ }
+  - Field2 with components: { VX, VY, VZ, AX, AY, AZ }
+"""
+        spec = Specification(
+            description=description,
+            map_input_pin_spec={
+                -3: PinSpecification(
+                    name="rescoping_value",
+                    type_names=["double"],
+                    optional=True,
+                    document=r"""Value used to fill the missing values when scopings are different. Default is 0.""",
+                ),
+                -2: PinSpecification(
+                    name="reference_scoping_index",
+                    type_names=["int32"],
+                    optional=True,
+                    document=r"""Pin of the field of which to take the scoping for the output field.
+If -1 all scopings will be merged, if -2 all scopings will be intersected. Default is -1.""", + ), + -1: PinSpecification( + name="field_support", + type_names=["abstract_field_support"], + optional=True, + document=r"""Support of the output fields container's fields. By default each field has the support of the corresponding field of the first fields container.""", + ), + 0: PinSpecification( + name="fields_containers", + type_names=["any"], + optional=False, + document=r"""A vector of fields containers to merge from pin 0 to ...""", + ), + 1: PinSpecification( + name="fields_containers", + type_names=["any"], + optional=False, + document=r"""A vector of fields containers to merge from pin 0 to ...""", + ), + }, + map_output_pin_spec={ + 0: PinSpecification( + name="merged_collections", + type_names=["fields_container"], + optional=False, + document=r"""Fields containers with fields which have as many components as the sum of all the input fields' numbers of components of the same index.""", + ), + }, + ) + return spec + + @staticmethod + def default_config(server: AnyServerType = None) -> Config: + """Returns the default config of the operator. + + This config can then be changed to the user needs and be used to + instantiate the operator. The Configuration allows to customize + how the operation will be processed by the operator. + + Parameters + ---------- + server: + Server with channel connected to the remote or local instance. When + ``None``, attempts to use the global server. + + Returns + ------- + config: + A new Config instance equivalent to the default config for this operator. + """ + return Operator.default_config( + name="merge::concatenate_fields_containers", server=server + ) + + @property + def inputs(self) -> InputsConcatenateFieldsContainers: + """Enables to connect inputs to the operator + + Returns + -------- + inputs: + An instance of InputsConcatenateFieldsContainers. 
+        """
+        return self._inputs
+
+    @property
+    def outputs(self) -> OutputsConcatenateFieldsContainers:
+        """Enables to get outputs of the operator by evaluating it
+
+        Returns
+        --------
+        outputs:
+            An instance of OutputsConcatenateFieldsContainers.
+        """
+        return self._outputs
+
+
+class InputsConcatenateFieldsContainers(_Inputs):
+    """Intermediate class used to connect user inputs to
+    concatenate_fields_containers operator.
+
+    Examples
+    --------
+    >>> from ansys.dpf import core as dpf
+    >>> op = dpf.operators.utility.concatenate_fields_containers()
+    >>> my_rescoping_value = float()
+    >>> op.inputs.rescoping_value.connect(my_rescoping_value)
+    >>> my_reference_scoping_index = int()
+    >>> op.inputs.reference_scoping_index.connect(my_reference_scoping_index)
+    >>> my_field_support = dpf.AbstractFieldSupport()
+    >>> op.inputs.field_support.connect(my_field_support)
+    >>> my_fields_containers1 = dpf.FieldsContainer()
+    >>> op.inputs.fields_containers1.connect(my_fields_containers1)
+    >>> my_fields_containers2 = dpf.FieldsContainer()
+    >>> op.inputs.fields_containers2.connect(my_fields_containers2)
+    """
+
+    def __init__(self, op: Operator):
+        super().__init__(concatenate_fields_containers._spec().inputs, op)
+        self._rescoping_value: Input[float] = Input(
+            concatenate_fields_containers._spec().input_pin(-3), -3, op, -1
+        )
+        self._inputs.append(self._rescoping_value)
+        self._reference_scoping_index: Input[int] = Input(
+            concatenate_fields_containers._spec().input_pin(-2), -2, op, -1
+        )
+        self._inputs.append(self._reference_scoping_index)
+        self._field_support: Input = Input(
+            concatenate_fields_containers._spec().input_pin(-1), -1, op, -1
+        )
+        self._inputs.append(self._field_support)
+        self._fields_containers1: Input = Input(
+            concatenate_fields_containers._spec().input_pin(0), 0, op, 0
+        )
+        self._inputs.append(self._fields_containers1)
+        self._fields_containers2: Input = Input(
+            concatenate_fields_containers._spec().input_pin(1), 1, op, 1
+        )
+        self._inputs.append(self._fields_containers2)
+
+    @property
+    def rescoping_value(self) -> Input[float]:
+        r"""Allows to connect rescoping_value input to the operator.
+
+        Value used to fill the missing values when scopings are different. Default is 0.
+
+        Returns
+        -------
+        input:
+            An Input instance for this pin.
+
+        Examples
+        --------
+        >>> from ansys.dpf import core as dpf
+        >>> op = dpf.operators.utility.concatenate_fields_containers()
+        >>> op.inputs.rescoping_value.connect(my_rescoping_value)
+        >>> # or
+        >>> op.inputs.rescoping_value(my_rescoping_value)
+        """
+        return self._rescoping_value
+
+    @property
+    def reference_scoping_index(self) -> Input[int]:
+        r"""Allows to connect reference_scoping_index input to the operator.
+
+        Pin of the field of which to take the scoping for the output field.
+        If -1 all scopings will be merged, if -2 all scopings will be intersected. Default is -1.
+
+        Returns
+        -------
+        input:
+            An Input instance for this pin.
+
+        Examples
+        --------
+        >>> from ansys.dpf import core as dpf
+        >>> op = dpf.operators.utility.concatenate_fields_containers()
+        >>> op.inputs.reference_scoping_index.connect(my_reference_scoping_index)
+        >>> # or
+        >>> op.inputs.reference_scoping_index(my_reference_scoping_index)
+        """
+        return self._reference_scoping_index
+
+    @property
+    def field_support(self) -> Input:
+        r"""Allows to connect field_support input to the operator.
+
+        Support of the output fields container's fields. By default each field has the support of the corresponding field of the first fields container.
+
+        Returns
+        -------
+        input:
+            An Input instance for this pin.
+ + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields_containers() + >>> op.inputs.field_support.connect(my_field_support) + >>> # or + >>> op.inputs.field_support(my_field_support) + """ + return self._field_support + + @property + def fields_containers1(self) -> Input: + r"""Allows to connect fields_containers1 input to the operator. + + A vector of fields containers to merge from pin 0 to ... + + Returns + ------- + input: + An Input instance for this pin. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields_containers() + >>> op.inputs.fields_containers1.connect(my_fields_containers1) + >>> # or + >>> op.inputs.fields_containers1(my_fields_containers1) + """ + return self._fields_containers1 + + @property + def fields_containers2(self) -> Input: + r"""Allows to connect fields_containers2 input to the operator. + + A vector of fields containers to merge from pin 0 to ... + + Returns + ------- + input: + An Input instance for this pin. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields_containers() + >>> op.inputs.fields_containers2.connect(my_fields_containers2) + >>> # or + >>> op.inputs.fields_containers2(my_fields_containers2) + """ + return self._fields_containers2 + + +class OutputsConcatenateFieldsContainers(_Outputs): + """Intermediate class used to get outputs from + concatenate_fields_containers operator. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields_containers() + >>> # Connect inputs : op.inputs. ... 
+ >>> result_merged_collections = op.outputs.merged_collections() + """ + + def __init__(self, op: Operator): + super().__init__(concatenate_fields_containers._spec().outputs, op) + self._merged_collections: Output[FieldsContainer] = Output( + concatenate_fields_containers._spec().output_pin(0), 0, op + ) + self._outputs.append(self._merged_collections) + + @property + def merged_collections(self) -> Output[FieldsContainer]: + r"""Allows to get merged_collections output of the operator + + Fields containers with fields which have as many components as the sum of all the input fields' numbers of components of the same index. + + Returns + ------- + output: + An Output instance for this pin. + + Examples + -------- + >>> from ansys.dpf import core as dpf + >>> op = dpf.operators.utility.concatenate_fields_containers() + >>> # Get the output from op.outputs. ... + >>> result_merged_collections = op.outputs.merged_collections() + """ + return self._merged_collections diff --git a/src/ansys/dpf/gatebin/Ans.Dpf.GrpcClient.dll b/src/ansys/dpf/gatebin/Ans.Dpf.GrpcClient.dll index 18bfa03d51a..545647d729e 100644 Binary files a/src/ansys/dpf/gatebin/Ans.Dpf.GrpcClient.dll and b/src/ansys/dpf/gatebin/Ans.Dpf.GrpcClient.dll differ diff --git a/src/ansys/dpf/gatebin/DPFClientAPI.dll b/src/ansys/dpf/gatebin/DPFClientAPI.dll index e0fee048613..1f69baabec7 100644 Binary files a/src/ansys/dpf/gatebin/DPFClientAPI.dll and b/src/ansys/dpf/gatebin/DPFClientAPI.dll differ
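For reviewers skimming this generated binding: the binding itself only wires pins, so the concatenation it exposes can be hard to picture. The sketch below illustrates the component-wise merge described in the docstring example in pure Python, using a hypothetical stand-in data model (a fields container as a list of fields, each field as a list of component labels). This is not the DPF API and needs no server.

```python
# Hypothetical stand-in model: a "fields container" is a list of fields,
# and each field is a list of component labels. This is NOT the DPF API;
# it only illustrates the component-wise concatenation the operator performs.
def concatenate_fields_containers(*containers):
    """Concatenate fields of the same index across all input containers."""
    if not containers:
        return []
    merged = []
    for i in range(len(containers[0])):
        field = []
        for container in containers:
            # Components of field i are appended in container order.
            field.extend(container[i])
        merged.append(field)
    return merged


# Mirrors the docstring example: Field1 gains {RX, RY, RZ}, Field2 gains {AX, AY, AZ}.
fc1 = [["UX", "UY", "UZ"], ["VX", "VY", "VZ"]]
fc2 = [["RX", "RY", "RZ"], ["AX", "AY", "AZ"]]
print(concatenate_fields_containers(fc1, fc2))
# [['UX', 'UY', 'UZ', 'RX', 'RY', 'RZ'], ['VX', 'VY', 'VZ', 'AX', 'AY', 'AZ']]
```

The real operator additionally handles mismatched scopings via `rescoping_value` and `reference_scoping_index`; this sketch assumes all inputs share the same scoping and field count.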