In general, structural uncertainties can be represented by random variables or stochastic processes within the framework of probability theory. More specifically, scalar quantities such as the cross-sectional dimensions of a beam can be modelled as random variables, while stochastic processes can be seen as collections of a finite number of random variables and can hence be employed to capture the variation of structural properties over time or space. For example, "random fields" are often used to model spatially varying properties over a domain, e.g. the Young's modulus of a plate. Stochastic processes evolving in time, on the other hand, can be used to represent complex loadings such as earthquake excitations or wind loads.
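
As an illustration (plain Python, not COSSAN-X syntax, with made-up parameter values), a scalar property can be sampled as a random variable, while a loading history can be generated as a discrete-time stochastic process:

```python
import math
import random

random.seed(0)

# Scalar uncertainty: Young's modulus as a lognormal random variable
# (the mean and scatter below are illustrative values only).
mu, sigma = math.log(210e9), 0.05          # log-space parameters
E_samples = [random.lognormvariate(mu, sigma) for _ in range(1000)]

# Time-varying uncertainty: a load history as a discrete-time Gaussian
# process, i.e. a finite collection of correlated random variables.
dt, n_steps = 0.01, 500
load = [0.0]
for _ in range(n_steps):
    # simple autoregressive scheme: each step depends on the previous one
    load.append(0.95 * load[-1] + random.gauss(0.0, 1.0) * math.sqrt(dt))

mean_E = sum(E_samples) / len(E_samples)
print(f"mean sampled Young's modulus: {mean_E:.3e} Pa")
```

The lognormal choice keeps the sampled modulus strictly positive, which is why it is a common model for material properties.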

*Key Features*

- 16 continuous and 6 discrete distribution types
- Possibility to create user-defined distributions
- Correlated multivariate distributions
- Gaussian mixture multivariate distributions
- Easy and interactive definition of input parameters
- Live-updating PDF/CDF plots
- Definition of stochastic processes
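
For instance, correlated multivariate Gaussian samples can be generated via a Cholesky factorization of the correlation matrix. The following sketch uses plain Python and an illustrative correlation of 0.8; it is not COSSAN-X code:

```python
import math
import random

random.seed(1)

# Target: two standard normal variables with correlation rho.
rho = 0.8
# 2x2 Cholesky factor of [[1, rho], [rho, 1]], written out by hand.
l11, l21, l22 = 1.0, rho, math.sqrt(1.0 - rho ** 2)

samples = []
for _ in range(20000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)  # independent draws
    x1 = l11 * z1                  # transform to the correlated pair
    x2 = l21 * z1 + l22 * z2
    samples.append((x1, x2))

# The empirical correlation should be close to the target rho.
n = len(samples)
m1 = sum(x for x, _ in samples) / n
m2 = sum(y for _, y in samples) / n
cov = sum((x - m1) * (y - m2) for x, y in samples) / n
v1 = sum((x - m1) ** 2 for x, _ in samples) / n
v2 = sum((y - m2) ** 2 for _, y in samples) / n
corr = cov / math.sqrt(v1 * v2)
print(f"empirical correlation: {corr:.3f}")
```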

Reliability analysis aims to assess and ensure the performance of a structure or component, i.e. that the envisioned tasks are performed by the design efficiently over its lifetime. The ultimate goal in this context is typically to estimate the probability of failure for a predefined failure event, as well as the time of its occurrence. In certain cases, failure can also be defined by combinations of various events, leading to the so-called "system reliability" analysis. It should be noted, however, that engineering applications generally exhibit a degree of complexity that prevents analytical solutions for this type of analysis. The main reason is that real-life problems are mostly treated with high-fidelity finite element (FE) models; in other words, a full FE analysis is required to calculate the response of a given structure. Moreover, since the assessment of reliability implies many consecutive simulations, the computational cost becomes an important issue.
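
The basic idea can be illustrated with a crude Monte Carlo estimate of the failure probability for a toy limit state g = R - S (plain Python with illustrative distribution parameters; this is the brute-force baseline that advanced simulation methods are designed to improve upon):

```python
import random

random.seed(2)

# Toy limit state g = R - S: failure when the load effect S exceeds
# the resistance R. The parameters are illustrative, not a real design.
N = 100_000
failures = 0
for _ in range(N):
    R = random.gauss(10.0, 1.0)   # resistance
    S = random.gauss(6.0, 1.5)    # load effect
    if R - S <= 0.0:              # failure event {g <= 0}
        failures += 1

pf = failures / N
print(f"estimated probability of failure: {pf:.4f}")
```

Note that resolving a failure probability of order 1e-2 already takes 1e5 samples here; for the much lower probabilities typical of engineering designs, crude Monte Carlo quickly becomes infeasible when each sample is a full FE analysis.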

To overcome these difficulties, COSSAN-X is equipped with recent and advanced simulation methods, which reduce the number of required simulations significantly. This is essential for practical applications, mainly because engineering designs generally demand very low failure probabilities. Furthermore, the reliability toolbox provides the necessary visualization tools, e.g. histograms, parallel-coordinates views and scatter plots, so that users can interpret the outcomes of the analysis efficiently and conveniently.

*Key Features*

- Advanced simulation methods such as Importance sampling, Line sampling and Subset simulation
- Quasi-Monte Carlo algorithms such as Latin Hypercube sampling and Sobol sampling
- System Reliability Analysis and Minimum cut set identification
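
As a sketch of how stratified sampling designs work, a minimal Latin Hypercube sampler can be written in a few lines (plain Python; the function name and arguments are illustrative, not the COSSAN-X API):

```python
import random

random.seed(3)

def latin_hypercube(n_samples, n_dims):
    """One-sample-per-stratum design on the unit hypercube."""
    design = []
    for _ in range(n_dims):
        # one uniform draw inside each of the n equal-width strata
        column = [(i + random.random()) / n_samples for i in range(n_samples)]
        random.shuffle(column)   # decouple the strata across dimensions
        design.append(column)
    # transpose: list of points rather than list of coordinates
    return list(zip(*design))

points = latin_hypercube(10, 2)
print(points[0])
```

Because every one-dimensional stratum contains exactly one sample, the marginal distributions are covered far more evenly than with the same number of plain Monte Carlo draws.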

In today's engineering practice, structural optimization is an almost indispensable step of the design cycle for any product or component. By means of optimization, engineers can achieve significant reductions in manufacturing and operating costs, as well as improvements in performance. Consequently, the field of optimization has been very active in terms of research and innovation, with new methods and advanced algorithms constantly being developed and introduced.

In this regard, the optimization toolbox of COSSAN-X meets these high standards, providing a set of widely used gradient-based and gradient-free algorithms for both small- and large-scale analyses, which can be applied to real-life problems involving continuous or discrete design variables, multiple constraints and multiple objective functions. The toolbox also provides the necessary guidance and assistance for selecting the most appropriate methods and input parameters for a given problem.

- A wide choice of algorithms specialized for different types of optimization problems, e.g. Genetic Algorithms, COBYLA, SQP, SIMPLEX, Simulated Annealing, Evolution Strategies, Cross Entropy, etc.
- Easy and interactive definition of Design Variables, Constraints and Objective Functions
- Interaction with third party FE solvers
- Possibility to solve Reliability-based Optimization problems
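
To illustrate one of the listed gradient-free methods, the following is a minimal from-scratch simulated annealing loop on a toy multimodal objective (plain Python; the cooling schedule, step size and objective are illustrative choices, not COSSAN-X internals):

```python
import math
import random

random.seed(4)

def objective(x):
    # toy multimodal objective with global minimum at x = 0
    return x * x + 10.0 * (1.0 - math.cos(3.0 * x))

def simulated_annealing(f, x0, n_iter=5000, T0=10.0, step=0.5):
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for k in range(n_iter):
        T = T0 * (1.0 - k / n_iter) + 1e-9   # simple linear cooling schedule
        cand = x + random.gauss(0.0, step)    # random local move
        fc = f(cand)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if fc < fx or random.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

x_opt, f_opt = simulated_annealing(objective, x0=8.0)
print(f"x* = {x_opt:.3f}, f(x*) = {f_opt:.4f}")
```

The occasional acceptance of uphill moves at high "temperature" is what lets the search escape the local minima where a pure gradient method would get stuck.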

In applications where a costly numerical model has to be evaluated many times (as is the case for simulation methods), the computational effort may become infeasible. One way to reduce the analysis time in such cases is to use meta-models, which approximate the quantities of interest at low computational cost. In other words, meta-models (also referred to as "response surfaces") mimic the behavior of the original model (say, a complex FE analysis) by means of an analytical expression with negligible computational cost. These surrogate models are mostly constructed from a set of training samples, as in the case of Artificial Neural Networks. Since stochastic analysis methods require substantial computational effort, especially for large and complex problems, meta-models are crucial for many applications. Their importance becomes even clearer in, for example, reliability-based optimization, where a reliability analysis has to be performed in every loop of the optimization cycle. COSSAN-X therefore offers the most widely used meta-modelling methods and techniques. Using the features of this toolbox, practitioners can interactively train a meta-model to replace their complex FE model and calibrate it to a desired accuracy. The constructed meta-model can then be provided as an input to a different toolbox for use within another analysis.

- Advanced meta modelling techniques such as Neural Networks or Polynomial Chaos expansion
- Straightforward construction of meta-models using training, calibration and validation features
- Possibility to employ the created meta-models within other toolboxes such as reliability analysis or reliability based optimization
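
The general idea of a response surface can be sketched as follows: a quadratic polynomial is fitted by least squares to a handful of training samples of an "expensive" model and then evaluated at negligible cost (plain Python; the model and sampling layout are illustrative, not the COSSAN-X workflow):

```python
import math

def expensive_model(x):
    # stand-in for a costly FE analysis (illustrative response only)
    return math.exp(0.5 * x)

def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x**2 via the normal equations."""
    S = lambda p: sum(x ** p for x in xs)
    A = [[len(xs), S(1), S(2)],
         [S(1),    S(2), S(3)],
         [S(2),    S(3), S(4)]]
    rhs = [sum(y * x ** p for x, y in zip(xs, ys)) for p in range(3)]
    # tiny Gaussian elimination with partial pivoting
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        rhs[i], rhs[piv] = rhs[piv], rhs[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * b for a, b in zip(A[r], A[i])]
            rhs[r] -= f * rhs[i]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):   # back substitution
        coeffs[i] = (rhs[i] - sum(A[i][j] * coeffs[j]
                                  for j in range(i + 1, 3))) / A[i][i]
    return coeffs

# training samples from the "expensive" model
xs = [0.25 * i for i in range(9)]             # 9 points on [0, 2]
ys = [expensive_model(x) for x in xs]
a, b, c = fit_quadratic(xs, ys)

surrogate = lambda x: a + b * x + c * x * x   # cheap analytical replacement
err = abs(surrogate(1.0) - expensive_model(1.0)) / expensive_model(1.0)
print(f"relative error at x=1: {err:.2%}")
```

Once calibrated, each surrogate evaluation costs a few floating-point operations, so it can be embedded inside a sampling or optimization loop where the original model could not.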

Stochastic finite element methods (SFEM) extend the capabilities of classical deterministic FE analysis in order to take structural uncertainties into account. By utilizing these methods, one can quantify and propagate the unavoidable uncertainties in structures. This information can then be processed to estimate the statistics of the structural response at critical locations and to assess the structure's reliability. It should also be noted that interaction with third-party FE solvers is especially important for SFEM. In this respect, the SFEM toolbox of COSSAN-X has been developed in particular to make use of actively developed and well-maintained deterministic FE software with advanced solution and visualization capabilities.

- Intrusive implementations within widely used FE solvers such as NASTRAN, ABAQUS and ANSYS
- Optimized data transfer between COSSAN-X and third-party FE solvers
- Integration of model reduction techniques for improved performance
- A list of most recent and widely used formulations such as Perturbation, Neumann Expansion and Polynomial Chaos Expansion
- Automatic adjustment of input parameters (e.g. drop tolerance or convergence threshold) for optimum efficiency
- Plug-ins embedded into third party pre-/postprocessors
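
For a single degree of freedom with displacement u = F/K and uncertain stiffness K, the first-order perturbation approach linearizes the response about the mean and can be checked against Monte Carlo (plain Python with illustrative numbers; actual SFEM formulations operate on full FE system matrices):

```python
import random

random.seed(5)

# Single-DOF system u = F / K with uncertain stiffness K ~ N(mu_K, sd_K).
# All numbers are illustrative (5% coefficient of variation on K).
F, mu_K, sd_K = 1000.0, 5.0e4, 2.5e3

# First-order perturbation: linearize u(K) about the mean stiffness.
u_mean_pert = F / mu_K                  # approx. E[u]
u_std_pert = (F / mu_K ** 2) * sd_K     # approx. std[u] = |du/dK| * sd_K

# Reference solution: crude Monte Carlo on the same model.
N = 200_000
us = [F / random.gauss(mu_K, sd_K) for _ in range(N)]
u_mean_mc = sum(us) / N
u_std_mc = (sum((u - u_mean_mc) ** 2 for u in us) / N) ** 0.5

print(f"perturbation: mean={u_mean_pert:.4e}, std={u_std_pert:.4e}")
print(f"monte carlo : mean={u_mean_mc:.4e}, std={u_std_mc:.4e}")
```

For small input variability the two agree closely, which is why perturbation methods are attractive: they need only one deterministic solve plus gradient information instead of thousands of samples.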

The sensitivity toolbox allows users to study the relationship between the input and output parameters of a model and to identify the most significant variables affecting the response. Consequently, sensitivity analysis is used in particular for model calibration, model validation and decision making, i.e. wherever it is crucial to identify the important parameters that contribute most to the output variability. In this toolbox, COSSAN-X offers various algorithms for local sensitivity analysis, global sensitivity analysis and screening. More specifically, while local sensitivity analysis provides information about the system behaviour around a selected point in the input domain, global sensitivity analysis techniques take the entire range of the input parameters into account. COSSAN-X also provides the necessary visualization tools, such as bar and pie charts, for a convenient interpretation of the analysis outcome.

- Local sensitivity analysis based on gradient estimation
- Simulation based sensitivity analysis
- Random Balance Design
- Calculation of Sobol' indices, total indices and upper bounds
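
The following sketch estimates first-order Sobol' indices for an additive toy model with a simple pick-freeze scheme (plain Python; the model is chosen so that the exact indices are 0.8 and 0.2, and the code is illustrative rather than the toolbox implementation):

```python
import random

random.seed(6)

def model(x1, x2):
    # additive toy model: analytically S1 = 0.8, S2 = 0.2
    return 2.0 * x1 + x2

N = 100_000
y, y1, y2 = [], [], []
for _ in range(N):
    a1, a2 = random.random(), random.random()   # first input sample
    b1, b2 = random.random(), random.random()   # independent second sample
    y.append(model(a1, a2))
    y1.append(model(a1, b2))   # "freeze" x1, resample x2
    y2.append(model(b1, a2))   # "freeze" x2, resample x1

mean = sum(y) / N
var = sum((v - mean) ** 2 for v in y) / N
# pick-freeze estimator of the first-order Sobol' indices
S1 = (sum(u * v for u, v in zip(y, y1)) / N - mean ** 2) / var
S2 = (sum(u * v for u, v in zip(y, y2)) / N - mean ** 2) / var
print(f"S1 = {S1:.3f}, S2 = {S2:.3f}")
```

For this additive model the first-order indices sum to one; interactions between inputs would show up as a gap between this sum and unity, which total indices are designed to capture.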

Since stochastic analysis methods are in general computationally demanding and require many consecutive analyses, decreasing the overall analysis time by exploiting parallel computation is crucial. In this context, the algorithms are expected to take advantage of the availability of a large number of cores and of the heterogeneity of grid computing, i.e. submitting a specific task to specific hardware in order to reduce the wall-clock time significantly. COSSAN-X offers the possibility to perform stochastic analysis using the computational power provided by grid computing, which offers more flexibility at relatively low cost than traditional parallel execution. Furthermore, by interfacing with grid managers it is possible to distribute the execution of the analysis over the available (remote) resources on a computer grid and to maximize the use of the available licenses, while reducing the wall-clock time of the analysis.

- CPU & GPU computing
- Oracle Grid Engine integration
- Licence Optimization
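
Conceptually, distributing independent analyses over workers looks like the following sketch, where a thread pool stands in for remote grid resources and `run_simulation` is a hypothetical placeholder for launching one solver job (plain Python, not the COSSAN-X grid interface):

```python
from concurrent.futures import ThreadPoolExecutor

def run_simulation(sample_id):
    # stand-in for launching one (remote) FE solver run; in practice this
    # would submit a job to the grid manager and wait for its result
    return sample_id, sample_id ** 2   # dummy "response"

samples = range(8)
# distribute the independent analyses over a pool of workers
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(run_simulation, samples))

print(results)
```

A thread pool is appropriate here because each task would spend its time waiting on an external solver process or remote job, not on Python computation; a real grid manager additionally handles scheduling, licenses and heterogeneous hardware.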