Volume 2015
Permanent URI for this collection: http://edoc.hu-berlin.de/18452/372
Recent Submissions
Publication 2015-10-05 Book
Clustering of sample average approximation for stochastic program
Chen, Lijian; Higle, Julie L.; Römisch, Werner; Sen, Suvrajeet
We present an improvement to the Sample Average Approximation (SAA) method for two-stage stochastic programs. Although SAA has attractive theoretical properties, such as convergence in probability and consistency provided the sample is large enough, the required sample size remains a concern for both academics and practitioners. Our clustering method employs the Maximum Volume Inscribed Ellipsoid (MVIE) to approximate the feasible set of each scenario and computes a measure of similarity. Scenarios are clustered according to this similarity measure, which reduces the sample size considerably. Moreover, the clustering method offers managerial insight by highlighting the scenarios that matter. The method can be implemented on a distributed computing infrastructure of low-cost machines.

Publication 2015-09-16 Book
Risk measures for vector-valued returns
Pichler, Alois; Higle, Julie L.; Römisch, Werner; Sen, Suvrajeet
Portfolios that are exposed to different currencies have separate and different returns in each individual currency and are thus vector-valued in a natural way. This paper investigates the natural domain of the associated risk measures. A Banach space is presented on which the risk measure is continuous and which reflects the vector-valued outcomes of the corresponding risk measures from mathematical finance. We develop its key properties and describe the corresponding duality theory. We finally outline extensions of this space, which are along the lines of the classical Lp spaces.

Publication 2015-10-16 Book
Parallel stochastic optimization based on descent algorithms
Bilenne, Olivier; Higle, Julie L.; Römisch, Werner; Sen, Suvrajeet
This study addresses the stochastic optimization of a function that is unknown in closed form and can only be estimated from measurements or simulations.
We consider parallel implementations of a class of stochastic optimization methods that consist of the iterative application of a descent algorithm to a sequence of approximation functions converging, in some sense, to the function of interest. After discussing classical parallel implementation modes (Jacobi, Gauss-Seidel, random, Gauss-Southwell), we devise effort-saving implementation modes in which the pace of application of the considered descent algorithm along individual coordinates is coordinated with the evolving accuracy of the convergent function sequence. We show that this approach can be regarded as a Gauss-Southwell implementation of the initial method in an augmented space. As an example application, we study the distributed optimization of stochastic networks using a scaled gradient projection algorithm with approximate line search, for which asymptotic properties are derived.

Publication 2015-09-14 Book
Convergence of the Smoothed Empirical Process in Nested Distance
Pflug, Georg Ch.; Pichler, Alois; Higle, Julie L.; Römisch, Werner; Sen, Suvrajeet
The nested distance, also called the process distance, provides a quantitative measure of distance between stochastic processes. It is the crucial and determining distance for stochastic optimization problems. In this paper we first demonstrate that the empirical measure, which is built from observed sample paths, does not converge in nested distance to its underlying distribution. We then show that smoothing convolutions, appropriately adapted from classical kernel density estimation, can be employed to modify the empirical measure so as to obtain stochastic processes that converge in nested distance to the underlying process. We employ these results to estimate transition probabilities at each time point.
Finally, we construct processes with discrete sample space from observed empirical paths, which approximate the original stochastic process well, as they converge in nested distance.

Publication 2015-05-12 Book
Statistical Estimation of Composite Risk Functionals and Risk Optimization Problems
Dentcheva, Darinka; Penev, Spiridon; Ruszczyński, Andrzej; Higle, Julie L.; Römisch, Werner; Sen, Suvrajeet
We address the statistical estimation of composite functionals that may be nonlinear in the probability measure. Our study is motivated by the need to estimate coherent measures of risk, which have become increasingly popular in finance, insurance, and other areas associated with optimization under uncertainty and risk. We establish central limit formulae for composite risk functionals. Furthermore, we discuss the asymptotic behavior of optimization problems whose objectives are composite risk functionals, and we establish a central limit formula for their optimal values when an estimator of the risk functional is used. While the mathematical structures accommodate commonly used coherent measures of risk, they have a more general character, which may be of independent interest.

Publication 2015-04-22 Book
A Comment on "Computational Complexity of Stochastic Programming Problems"
Hanasusanto, Grani A.; Kuhn, Daniel; Wiesemann, Wolfram; Higle, Julie L.; Römisch, Werner; Sen, Suvrajeet
Although stochastic programming problems have always been believed to be computationally challenging, this perception has only recently received a theoretical justification by the seminal work of Dyer and Stougie (Mathematical Programming A, 106(3):423-432, 2006). Amongst others, that paper argues that linear two-stage stochastic programs with fixed recourse are #P-hard even if the random problem data is governed by independent uniform distributions.
We show that Dyer and Stougie's proof is not correct, and we offer a correction which establishes the stronger result that even the approximate solution of such problems is #P-hard for sufficiently high accuracy. We also prove that the approximate solution of linear two-stage stochastic programs with random recourse is strongly #P-hard.

Publication 2015-04-09 Book
A Simulation Based Approach to Solve A Specific Type of Chance Constrained Optimization
Chen, Lijian; Higle, Julie L.; Römisch, Werner; Sen, Suvrajeet
We solve chance constrained optimization problems with a convex feasible set by approximating the chance constraint with another convex, smooth function. The approximation is based on the numerical properties of the Bernstein polynomial, which is capable of effectively controlling the approximation error for both function values and gradients. We thus adopt a first-order algorithm to reach a satisfactory solution, which is expected to be optimal. When an explicit expression of the joint distribution is not available, we use a Monte Carlo approach to evaluate the chance constraint numerically and obtain an optimal solution in probability. Numerical results for known problem instances are presented.
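The Monte Carlo evaluation mentioned in the last abstract can be sketched as follows. This is an illustrative toy, not the paper's code: the linear constraint g, the uniform scenario model, and all names are assumptions made purely for the example. The idea is simply to estimate P(g(x, ξ) ≤ 0) by the empirical frequency over simulated scenarios and compare it against the required level 1 − α.

```python
import random

# Hypothetical illustration: Monte Carlo check of a chance constraint
# P(g(x, xi) <= 0) >= 1 - alpha, where the random vector xi can only be
# sampled by simulation. The constraint g(x, xi) = xi1*x1 + xi2*x2 - 1
# with xi ~ Uniform[0,1]^2 is a toy stand-in for the application's g.

def estimate_satisfaction_probability(x, n_samples, rng):
    """Empirical frequency of g(x, xi) <= 0 over simulated scenarios."""
    hits = 0
    for _ in range(n_samples):
        xi1, xi2 = rng.random(), rng.random()   # one simulated scenario
        if xi1 * x[0] + xi2 * x[1] - 1.0 <= 0.0:
            hits += 1
    return hits / n_samples

rng = random.Random(42)                          # fixed seed for repeatability
p = estimate_satisfaction_probability((0.5, 0.5), 10_000, rng)
feasible = p >= 1.0 - 0.05                       # check the 95% chance constraint
```

In a first-order scheme like the one the abstract describes, such an estimate would be evaluated at each candidate point x; the estimator's error shrinks at the usual Monte Carlo rate O(n^(-1/2)), which is why the smooth Bernstein-type approximation of the constraint is attractive when gradients are also needed.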