.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_gpc/plot_projection.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_gpc_plot_projection.py>` to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_gpc_plot_projection.py:


Dimensionality reduction
========================

Introduction
^^^^^^^^^^^^
Many models show redundancies in the form of correlations between the parameters under investigation. Especially for high-dimensional problems, the correlative behavior of the target variable as a function of the parameters is usually completely unknown. Correlated parameters can be combined into surrogate parameters by means of a principal component analysis, which reduces the effective number of parameters. This significantly lowers the computational effort compared to the original problem, with only minor losses in modeling accuracy.

In this case, the :math:`n_d` original random variables :math:`\mathbf{\xi}` are reduced to a new set of :math:`n_d'` random variables :math:`\mathbf{\eta}`, where :math:`n_d' < n_d`. The reduced variables are a linear combination of the original ones:

.. math::

    \mathbf{\eta} = [\mathbf{W}] \mathbf{\xi}

The projection matrix :math:`[\mathbf{W}]` is determined from the singular value decomposition of the gradients of the solution vector with respect to the original parameters, so this may result in additional function evaluations. Once the gradients have been calculated, however, the gPC coefficients can be computed with higher accuracy and fewer additional sampling points, as described in the :ref:`Gradient enhanced gPC`. Accordingly, the choice of which method to select is (as usual) highly dependent on the underlying problem and its compression capabilities. Note that the projection matrix :math:`[\mathbf{W}]` has to be determined for each QoI separately.
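
As a rough illustration of how such a projection matrix can be obtained, the following standalone sketch (plain NumPy, not pygpc's actual implementation) estimates the gradients of this example's test function by forward finite differences and extracts the dominant direction via an SVD. The sample count, step size, and singular-value tolerance are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
u, a = 0.5, 5.0  # constants of the oscillatory test function used in this example

def model(x):
    # y = cos(2*pi*u + a*(x1 + x2)); varies only along the direction x1 + x2
    return np.cos(2 * np.pi * u + a * x.sum(axis=1))

def gradients(x, dx=1e-3):
    # First-order forward finite differences, one column per parameter
    y0 = model(x)
    g = np.empty_like(x)
    for i in range(x.shape[1]):
        xp = x.copy()
        xp[:, i] += dx
        g[:, i] = (model(xp) - y0) / dx
    return g

x = rng.random((100, 2))   # random samples in the unit square
g = gradients(x)

# SVD of the gradient matrix: rows of vt are the principal directions
_, s, vt = np.linalg.svd(g, full_matrices=False)

# Keep only directions whose singular values are non-negligible
w = vt[s / s[0] > 1e-6, :]
print(w.shape)   # (1, 2): the problem is effectively one-dimensional
print(w)         # both entries ~ +/-0.7071, i.e. the diagonal direction
```

Because the function depends on the parameters only through their sum, all gradient samples point along the same axis, and a single direction captures the full variability.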
The projection approach is implemented in the following algorithms:

* :ref:`Algorithm: StaticProjection`
* :ref:`Algorithm: RegAdaptiveProjection`
* :ref:`Algorithm: MEStaticProjection`
* :ref:`Algorithm: MERegAdaptiveProjection`

Example
^^^^^^^
Let us consider the following :math:`n_d`-dimensional test function:

.. math::

    y = \cos \left( 2 \pi u + a \sum_{i=1}^{n_d} \xi_i \right)

with :math:`u=0.5` and :math:`a=5.0`. Without loss of generality, we assume the two-dimensional case, i.e. :math:`n_d=2`, for now. This function can be expressed by only one random variable :math:`\eta`, which is a linear combination of the original random variables :math:`\xi`:

.. math::

    \eta = \sum_{i=1}^{n_d} \xi_i

This function is implemented in the ``testfunctions`` module of pygpc as ``GenzOscillatory``. In the following, we will set up a static gPC with fixed order, using the previously described projection approach to reduce the original dimensionality of the problem.

.. GENERATED FROM PYTHON SOURCE LINES 114-116

Setting up the problem
----------------------

.. GENERATED FROM PYTHON SOURCE LINES 116-132

.. code-block:: default

    import pygpc
    from collections import OrderedDict

    # Loading the model and defining the problem
    # ------------------------------------------

    # Define model
    model = pygpc.testfunctions.GenzOscillatory()

    # Define problem
    parameters = OrderedDict()
    parameters["x1"] = pygpc.Beta(pdf_shape=[1., 1.], pdf_limits=[0., 1.])
    parameters["x2"] = pygpc.Beta(pdf_shape=[1., 1.], pdf_limits=[0., 1.])
    problem = pygpc.Problem(model, parameters)

.. GENERATED FROM PYTHON SOURCE LINES 133-135

Setting up the algorithm
------------------------

.. GENERATED FROM PYTHON SOURCE LINES 135-154

.. code-block:: default

    # gPC options
    options = dict()
    options["method"] = "reg"
    options["solver"] = "Moore-Penrose"
    options["settings"] = None
    options["interaction_order"] = 1
    options["n_cpu"] = 0
    options["error_type"] = "nrmsd"
    options["n_samples_validation"] = 1e3
    options["error_norm"] = "relative"
    options["matrix_ratio"] = 2
    options["qoi"] = 0
    options["fn_results"] = 'tmp/staticprojection'
    options["save_session_format"] = ".pkl"
    options["grid"] = pygpc.Random
    options["grid_options"] = {"seed": 1}
    options["n_grid"] = 100

.. GENERATED FROM PYTHON SOURCE LINES 155-158

Since we have to compute the gradients of the solution for the projection approach anyway, we also make use of them when determining the gPC coefficients. Therefore, we enable the "gradient_enhanced" gPC. For more details, please see :ref:`Gradient enhanced gPC`.

.. GENERATED FROM PYTHON SOURCE LINES 158-160

.. code-block:: default

    options["gradient_enhanced"] = True

.. GENERATED FROM PYTHON SOURCE LINES 161-163

In the following, we choose the method to determine the gradients. We will use a classical first-order forward finite difference approximation for now.

.. GENERATED FROM PYTHON SOURCE LINES 163-166

.. code-block:: default

    options["gradient_calculation"] = "FD_fwd"
    options["gradient_calculation_options"] = {"dx": 0.001, "distance_weight": -2}

.. GENERATED FROM PYTHON SOURCE LINES 167-169

We will use a 10th-order approximation. Note that the reduced model will consist of only one random variable; including the mean (0th-order coefficient), there will be 11 gPC coefficients in total.

.. GENERATED FROM PYTHON SOURCE LINES 169-172

.. code-block:: default

    options["order"] = [10]
    options["order_max"] = 10

.. GENERATED FROM PYTHON SOURCE LINES 173-174

Now we define the ``StaticProjection`` algorithm to solve the given problem.

.. GENERATED FROM PYTHON SOURCE LINES 174-176

.. code-block:: default

    algorithm = pygpc.StaticProjection(problem=problem, options=options)
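
To see why the projection pays off, one can count gPC coefficients. For a basis truncated at maximum total order :math:`p` in :math:`d` dimensions, the number of coefficients is :math:`\binom{d+p}{p}` (standard combinatorics, independent of pygpc; with a restricted interaction order, as in the options above, the count in the original space would be smaller, but the reduction is still substantial):

```python
from math import comb

def n_coeffs(dim, order):
    # Coefficients of a gPC basis with maximum total order `order` in `dim` dimensions
    return comb(dim + order, order)

print(n_coeffs(1, 10))   # 11 -> the reduced one-dimensional problem, as stated above
print(n_coeffs(2, 10))   # 66 -> full 10th-order basis in the original 2D space
```

Fewer coefficients mean a smaller regression problem and, for a fixed matrix ratio, fewer required sampling points.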
.. GENERATED FROM PYTHON SOURCE LINES 177-179

Running the gpc
---------------

.. GENERATED FROM PYTHON SOURCE LINES 179-186

.. code-block:: default

    # Initialize gPC Session
    session = pygpc.Session(algorithm=algorithm)

    # Run gPC algorithm
    session, coeffs, results = session.run()

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Creating initial grid () with n_grid=100
    Determining gPC approximation for QOI #0:
    =========================================
    Performing 100 simulations!
    It/Sub-it: 10/1 Performing simulation 001 from 100 [ ] 1.0%
    Total function evaluation: 0.00021028518676757812 sec
    It/Sub-it: 10/1 Performing simulation 001 from 200 [ ] 0.5%
    Gradient evaluation: 0.0012793540954589844 sec
    Determine gPC coefficients using 'Moore-Penrose' solver (gradient enhanced)...
    -> relative nrmsd error = 0.000693256699171336

.. GENERATED FROM PYTHON SOURCE LINES 187-191

Inspecting the gpc object
-------------------------
The SVD of the gradients of the solution vector resulted in the following projection matrix, reducing the problem from two dimensions to one:

.. GENERATED FROM PYTHON SOURCE LINES 191-193

.. code-block:: default

    print(f"Projection matrix [W]: {session.gpc[0].p_matrix}")

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Projection matrix [W]: [[-0.70710678 -0.70710678]]

.. GENERATED FROM PYTHON SOURCE LINES 194-198

It is of size :math:`n_d' \times n_d`, i.e. :math:`[1 \times 2]` in our case. Because the model is a simple sum of the random variables, it can be seen directly from the projection matrix that the principal axis lies at 45° between the original parameter axes, exactly as the SVD of the gradients of the solution predicts. As a result, the number of gPC coefficients for a 10th-order gPC approximation with only one random variable is 11:

.. GENERATED FROM PYTHON SOURCE LINES 198-200

.. code-block:: default

    print(f"Number of gPC coefficients: {session.gpc[0].basis.n_basis}")

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Number of gPC coefficients: 11

.. GENERATED FROM PYTHON SOURCE LINES 201-202

Accordingly, the gPC matrix has 11 columns, one for each gPC coefficient:

.. GENERATED FROM PYTHON SOURCE LINES 202-204

.. code-block:: default

    print(f"Size of gPC matrix: {session.gpc[0].gpc_matrix.shape}")

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Size of gPC matrix: (100, 11)

.. GENERATED FROM PYTHON SOURCE LINES 205-209

As mentioned previously, one can make use of the :ref:`Gradient enhanced gPC` when using the projection approach. Internally, the gradients are also rotated, and the gPC matrix is extended by the gPC matrix containing the derivatives:

.. GENERATED FROM PYTHON SOURCE LINES 209-211

.. code-block:: default

    print(f"Size of gPC matrix containing the derivatives: {session.gpc[0].gpc_matrix_gradient.shape}")

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Size of gPC matrix containing the derivatives: (100, 11)

.. GENERATED FROM PYTHON SOURCE LINES 212-213

The random variables of the original and the reduced problem can be reviewed in:

.. GENERATED FROM PYTHON SOURCE LINES 213-216

.. code-block:: default

    print(f"Original random variables: {session.gpc[0].problem_original.parameters_random}")
    print(f"Reduced random variables: {session.gpc[0].problem.parameters_random}")

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Original random variables: OrderedDict([('x1', ), ('x2', )])
    Reduced random variables: OrderedDict([('n0', )])

.. GENERATED FROM PYTHON SOURCE LINES 217-221

Postprocessing
--------------
The post-processing works identically to that of a standard gPC. The routines identify whether the problem was reduced and provide all sensitivity measures with respect to the original model parameters.

.. GENERATED FROM PYTHON SOURCE LINES 221-243
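
Since the test function is symmetric in :math:`x_1` and :math:`x_2`, their first-order Sobol indices must coincide. This can be sanity-checked with a brute-force dense-grid estimator (plain NumPy, independent of pygpc; the grid resolution is an illustrative choice, and the values need not match pygpc's sampling-based estimates exactly):

```python
import numpy as np

u, a = 0.5, 5.0
n = 1001  # dense evaluation grid (illustrative resolution)
x1, x2 = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
y = np.cos(2 * np.pi * u + a * (x1 + x2))

# First-order Sobol index: variance of the conditional mean over the total variance
var_y = y.var()
s1 = y.mean(axis=1).var() / var_y   # condition on x1
s2 = y.mean(axis=0).var() / var_y   # condition on x2
print(s1, s2)   # equal by symmetry, roughly 0.07 each
```

Most of the variance is carried by the joint term along :math:`x_1 + x_2`, which is exactly the direction the projection exploits.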
.. code-block:: default

    # Post-process gPC and save sensitivity coefficients in .hdf5 file
    pygpc.get_sensitivities_hdf5(fn_gpc=options["fn_results"],
                                 output_idx=None,
                                 calc_sobol=True,
                                 calc_global_sens=True,
                                 calc_pdf=False,
                                 algorithm="sampling",
                                 n_samples=10000)

    # Get summary of sensitivity coefficients
    sobol, gsens = pygpc.get_sens_summary(fn_gpc=options["fn_results"],
                                          parameters_random=session.parameters_random,
                                          fn_out=None)
    print(f"\nSobol indices:")
    print(f"==============")
    print(sobol)
    print(f"\nGlobal average derivatives:")
    print(f"===========================")
    print(gsens)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    > Loading gpc session object: tmp/staticprojection.pkl
    > Loading gpc coeffs: tmp/staticprojection.hdf5
    > Adding results to: tmp/staticprojection.hdf5

    Sobol indices:
    ==============
                  sobol_norm (qoi 0)
    ['x1', 'x2']            0.855755
    ['x2']                  0.076095
    ['x1']                  0.075139

    Global average derivatives:
    ===========================
        global_sens (qoi 0)
    x1            -0.151716
    x2            -0.151716

.. GENERATED FROM PYTHON SOURCE LINES 244-251

Validation
----------
Validate gPC vs original model function (2D-surface)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The potential for parameter reduction becomes most apparent when the function values are visualized in the parameter space. In this simple example, there is almost no difference between the original model (left) and the reduced gPC (center).

.. GENERATED FROM PYTHON SOURCE LINES 251-261

.. code-block:: default

    pygpc.validate_gpc_plot(session=session,
                            coeffs=coeffs,
                            random_vars=list(problem.parameters_random.keys()),
                            n_grid=[51, 51],
                            output_idx=[0],
                            fn_out=None,
                            folder=None,
                            n_cpu=session.n_cpu)

.. image-sg:: /auto_gpc/images/sphx_glr_plot_projection_001.png
   :alt: Original model, gPC approximation, Difference (Original vs gPC)
   :srcset: /auto_gpc/images/sphx_glr_plot_projection_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 262-271

References
^^^^^^^^^^
.. [1] Tipireddy, R., & Ghanem, R. (2014). Basis adaptation in homogeneous chaos spaces. Journal of Computational Physics, 259, 304-317.

.. [2] Tsilifis, P., Huan, X., Safta, C., Sargsyan, K., Lacaze, G., Oefelein, J. C., Najm, H. N., & Ghanem, R. G. (2019). Compressive sensing adaptation for polynomial chaos expansions. Journal of Computational Physics, 380, 29-47.

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 1.130 seconds)

.. _sphx_glr_download_auto_gpc_plot_projection.py:

.. only:: html

  .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_projection.py <plot_projection.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_projection.ipynb <plot_projection.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_