Grab samples were taken upstream, downstream, and at the effluent of 24 Swiss WWTPs during low-flow conditions across independent catchments with different land uses.

As more high-quality training data becomes available, the improvements in machine learning methods will likely continue, but the alternative approaches still provide valuable complementary information.

In the EcoImpact project, the ecotoxicological and ecological effects of micropollutants from wastewater treatment plants on aquatic communities were investigated. Chemical and biological investigations upstream and downstream of treatment plants point to effects of these substances, ranging from physiological responses of organisms to altered ecosystem functions.

Targeted flume experiments with controlled water quality support these findings.

Phytoplankton constitute an important component of surface water ecosystems; however, little is known about their contribution to the biotransformation of organic micropollutants. To elucidate biotransformation processes, batch experiments were conducted with two cyanobacterial species, Microcystis aeruginosa and Synechococcus sp. Twenty-four micropollutants were studied, including 15 fungicides and 9 pharmaceuticals.

The observed transformation pathways included reactions likely mediated by promiscuous enzymes, such as glutamate conjugation to mefenamic acid and pterin conjugation of sulfamethoxazole. For 15 compounds, including all azole fungicides tested, no TPs were identified. Environmentally relevant concentrations of chemical stressors had no influence on the transformation types and rates. Processes and Impacts, 19(6), doi: …

The overarching aim of this field study was to examine causal links between in-situ exposure to complex mixtures of micropollutants from wastewater treatment plants and effects on freshwater microbial communities in the receiving streams.

To reach this goal, we assessed the toxicity of serial dilutions of micropollutant mixtures, extracted from passive samplers deployed at the discharge sites of four Swiss wastewater treatment plants, to in-situ periphyton from upstream and downstream of the effluents. On the one hand, comparison of the sensitivities of upstream and downstream periphyton to the micropollutant mixtures indicated that the algal and bacterial communities composing the periphyton were more tolerant of these micropollutants downstream than upstream.

On the other hand, molecular analyses of the algal and bacterial community structure showed a clear separation between upstream and downstream periphyton across the sites. This finding provides an additional line of evidence that micropollutants from the wastewater discharges were directly responsible for the change in community structure at the sampling sites, eliminating micropollutant-sensitive species and favouring tolerant ones. Moreover, the fold increase of algal and bacterial tolerance from upstream to downstream locations varied among sampling sites and was strongly correlated with the intensity of micropollutant contamination at the respective sites.
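The tolerance comparison described above boils down to EC50 ratios (downstream over upstream) and their correlation with contamination intensity. A minimal sketch; all site names and numbers below are hypothetical illustrations, not data from the study:

```python
# Sketch: fold increase in community tolerance per site and its correlation
# with micropollutant contamination. All values are invented for illustration.
from math import sqrt

# Hypothetical EC50s of upstream and downstream periphyton at four sites.
ec50_upstream   = {"A": 1.0, "B": 0.8, "C": 1.2, "D": 0.9}
ec50_downstream = {"A": 2.5, "B": 1.4, "C": 4.8, "D": 1.6}
# Hypothetical contamination intensity (e.g. summed toxic units) per site.
contamination   = {"A": 0.9, "B": 0.4, "C": 1.8, "D": 0.5}

sites = sorted(ec50_upstream)
fold_tolerance = [ec50_downstream[s] / ec50_upstream[s] for s in sites]
intensity      = [contamination[s] for s in sites]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(fold_tolerance, intensity)
print({s: round(f, 2) for s, f in zip(sites, fold_tolerance)})
print(f"r = {r:.2f}")
```

With these invented numbers the fold tolerance tracks contamination closely, which is the pattern the study reports across its real sites.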

Overall, our study highlights the sensitivity of the proposed approach for disentangling the effects of micropollutant mixtures from other environmental factors in the field and thus for establishing a causal link between exposure and the observed ecological effects on freshwater microbial communities.

In the Hardwald near Muttenz, Rhine water is infiltrated for drinking water production.

During soil passage, around half of the trace substances present in the Rhine water are removed. A large proportion of the remaining substances is removed by the activated carbon filtration of the recharged groundwater.

The study presented below shows how well trace substances are retained in the activated carbon filter and whether a preceding or subsequent oxidation step can lead to better removal.

Citalopram (CTR) is a widely consumed antidepressant that is incompletely removed by conventional wastewater treatment. Despite its ubiquitous presence in different environmental compartments, little is known about its behaviour and transformation processes during wastewater treatment.

The present study aims to expand the knowledge on the fate and transformation of CTR during biological treatment. For this purpose, batch reactors were set up to assess biotic, abiotic and sorption losses of this compound. One of the main objectives of the study was the identification of the formed transformation products (TPs) by applying suspect and non-target strategies based on liquid chromatography quadrupole time-of-flight mass spectrometry (LC-QTOF-MS).

In total, fourteen TPs were detected, and thirteen of them were tentatively identified. Probable structures based on diagnostic evidence were proposed for a further nine TPs.

Eleven TPs are reported for the first time. A pathway for the biotransformation of CTR was proposed. The presence of the identified TPs was assessed in real wastewater samples through retrospective analysis, resulting in the detection of five compounds. Finally, the potential ecotoxicological risk posed by CTR and its TPs to different trophic levels of aquatic organisms was evaluated by means of risk quotients.

Human-induced environmental changes are causing major shifts in ecosystems around the globe.

To support environmental management, scientific research has to infer both general trends and context dependency in these shifts at global and local scales. Combining replicated real-world experiments, which take advantage of implemented mitigation measures or other forms of human impact, with research-led experimental manipulations can provide powerful scientific tools for inferring causal drivers of ecological change and the generality of their effects.

Additionally, combining these two approaches can facilitate communication with stakeholders involved in implementing management strategies. We demonstrate such an integrative approach using the case study EcoImpact, which aims at empirically unravelling the impacts of wastewater-borne micropollutants on aquatic ecosystems. Model Systems to Global Perspectives, doi: …

The Dagstuhl Seminar on Computational Metabolomics brought together leading experimental (analytical chemistry and biology) and computational (computer science and bioinformatics) experts with the aim of fostering the exchange of expertise needed to advance computational metabolomics.

The focus was on a dynamic schedule with overview talks followed by breakout sessions, selected by the participants, covering the whole experimental-computational continuum in mass spectrometry, as well as the use of metabolomics data in applications. A general observation was that metabolomics is in the state that genomics was 20 years ago and that, while the availability of data is holding back progress, several good initiatives are present.

The importance of small molecules to life should be communicated properly to assist in initiating a global metabolomics initiative, analogous to the Human Genome Project. Several follow-ups were discussed, including workshops, hackathons, joint papers and a new Dagstuhl Seminar in two years to follow up on this one.

Essener Tagung für Wasser und Abfallwirtschaft.

The efficiency of wastewater ozonation for the abatement of three nitrogen-containing pharmaceuticals, the two antihistamine drugs cetirizine (CTR) and fexofenadine (FXF) and the diuretic drug hydrochlorothiazide (HCTZ), was investigated.

All three compounds are very reactive with ozone (apparent second-order rate constants at pH 7: …). Transformation product (TP) structures were elucidated using liquid chromatography coupled with high-resolution tandem mass spectrometry, including isotope-labeled standards. For cetirizine and hydrochlorothiazide, 8 TPs each were identified; for fexofenadine, 7 TPs.
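For orientation, the link between such second-order rate constants and abatement during ozonation can be sketched via the ozone exposure (the time integral of the dissolved ozone concentration). The rate constant and exposure values below are assumed for illustration, not taken from the study:

```python
# Sketch: residual fraction of a micropollutant after ozonation from
# second-order kinetics, C/C0 = exp(-k_O3 * OT), where OT is the ozone
# exposure (integral of the O3 concentration over time). Values assumed.
from math import exp

k_o3 = 1.0e5                            # apparent rate constant, M^-1 s^-1 (assumed)
ozone_exposures = [1e-5, 5e-5, 1e-4]    # ozone exposures in M*s (assumed)

for ot in ozone_exposures:
    residual = exp(-k_o3 * ot)
    print(f"OT = {ot:.0e} M*s -> {100 * (1 - residual):.1f}% abatement")
```

The sketch shows why highly ozone-reactive compounds such as these are abated almost completely even at modest ozone exposures.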

In the bacterial bioluminescence assay, toxicity was slightly increased only during the ozonation of cetirizine at very high cetirizine concentrations.

Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health.

This contamination cannot be addressed with target analysis alone; tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and attracts increasing interest in water and sediment quality monitoring. The present paper therefore summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application.

The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA.

Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control.

Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification.

The latest approaches, tools, software and databases for target, suspect and non-target screening, as well as for the identification of unknowns, are discussed together with analytical and toxicological confirmation approaches.

A better understanding of optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments.

The main removal process for polar organic micropollutants during activated sludge treatment is biotransformation, which often leads to the formation of stable transformation products (TPs). Because the analysis of TPs is challenging, the use of pathway prediction systems can help by generating a list of suspected TPs.

To complete and refine pathway prediction, comprehensive biotransformation studies for compounds exhibiting pertinent functional groups under environmentally relevant conditions are needed. Because many polar organic micropollutants present in wastewater contain one or several amine functional groups, we systematically explored amine biotransformation by conducting experiments with 19 compounds that contained 25 structurally diverse primary, secondary, and tertiary amine moieties.

The identification of TP candidates and the structure elucidation of these resulted in a comprehensive view of initial amine biotransformation reactions. The reactions with the highest relevance were N-oxidation, N-dealkylation, N-acetylation, and N-succinylation.

Whereas many of the observed reactions were similar to those known from the mammalian metabolism of amine-containing xenobiotics, some N-acylation reactions had not previously been described.

In general, different reactions at the amine functional group occurred in parallel. Finally, recommendations on how these findings can be implemented to improve microbial pathway prediction of amine-containing micropollutants are given.
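One way such findings feed into suspect screening is by turning the observed amine reactions into exact monoisotopic mass shifts applied to a parent compound. A minimal sketch; the example parent (atenolol) and the flat mass-shift list are illustrative assumptions, not the study's method:

```python
# Sketch: generating a suspect list of TP exact masses from a parent's
# monoisotopic mass and the amine biotransformation reactions named above.
# Mass shifts are exact monoisotopic differences of the formula changes.

MASS_SHIFTS = {
    "N-oxidation":     +15.994915,   # +O
    "N-dealkylation":  -14.015650,   # -CH2
    "N-acetylation":   +42.010565,   # +C2H2O
    "N-succinylation": +100.016044,  # +C4H4O3
}

def suspect_tps(parent_mass: float) -> dict:
    """Return {reaction: suspected TP monoisotopic mass}."""
    return {rxn: round(parent_mass + shift, 6) for rxn, shift in MASS_SHIFTS.items()}

atenolol = 266.163043  # monoisotopic mass of atenolol (C14H22N2O3), illustrative
for rxn, mass in suspect_tps(atenolol).items():
    print(f"{rxn:>15}: {mass:.4f}")
```

The resulting masses can be matched against high-resolution MS features to prioritize candidate TPs before structure elucidation.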

Oxygen isotope fractionation of molecular O2 is an important process for the study of aerobic metabolism, photosynthesis, and the formation of reactive oxygen species.

The latter is of particular interest for investigating the mechanism of enzyme-catalyzed reactions, such as the oxygenation of organic pollutants, which is an important detoxification mechanism.

The chromatographic separation of O2 and N2 with a molecular sieve column made it possible to use N2 as the headspace gas for the extraction of dissolved O2 from water. After an N2 headspace is created, the dissolved O2 partitions from the aqueous solution into the headspace, from which it can be injected into the gas chromatograph.

The successful quantification of 18O kinetic isotope effects associated with the enzymatic and chemical reduction of dissolved O2 illustrates how the proposed method can be applied for studying enzymatic O2 activation mechanisms in a variety of (bio)chemical processes.

The in silico fragmenter MetFrag was one of the first approaches combining compound database searching and fragmentation prediction for small molecule identification from tandem mass spectrometry data.

Since then many new approaches have evolved, as has MetFrag itself. This article details the latest developments to MetFrag and its use in small molecule identification since the original publication. MetFrag has gone through algorithmic and scoring refinements.

New features include the retrieval of reference, data source and patent information via ChemSpider and PubChem web services, as well as InChIKey filtering to reduce candidate redundancy due to stereoisomerism. Retention time information can now be calculated either within MetFrag with a sufficient amount of user-provided retention times, or incorporated separately as "user-defined scores" to be included in candidate ranking.
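The combination of fragmenter, reference and retention scores into a single ranking can be sketched as a weighted sum of maximum-normalized scores. The candidate names, score values and equal weights below are invented, and MetFrag's exact normalization and weighting may differ:

```python
# Sketch: consensus ranking of candidates from several score types,
# in the spirit of MetFrag's candidate ranking with user-defined scores.

candidates = {
    # name: (fragmenter_score, reference_count, retention_score) - invented
    "candidate_A": (250.0, 120, 0.9),
    "candidate_B": (300.0,   3, 0.4),
    "candidate_C": (180.0,  45, 0.8),
}
weights = (1.0, 1.0, 1.0)  # equal weighting (assumed)

def normalize(values):
    """Scale a list of scores to [0, 1] by dividing by the maximum."""
    top = max(values)
    return [v / top if top else 0.0 for v in values]

names = list(candidates)
columns = list(zip(*candidates.values()))            # one column per score type
norm_columns = [normalize(list(col)) for col in columns]

ranked = sorted(
    names,
    key=lambda n: -sum(w * col[names.index(n)]
                       for w, col in zip(weights, norm_columns)),
)
print(ranked)
```

Here the candidate with only the best fragmenter score loses to one that also has many references and a matching retention time, which is exactly the benefit of combining evidence.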

Including reference and retention information in MetFrag2.2 improves candidate ranking. Further examples are given to demonstrate the flexibility of the enhanced features. In many cases additional information is available from the experimental context to support small molecule identification, which is especially useful where the mass spectrum alone is not sufficient for candidate selection from a large number of candidates.

The new functions greatly enhance the chances of identification success and have been incorporated into a command-line interface in a flexible way designed to be integrated into high-throughput workflows. Feedback on the command-line version of MetFrag2.2 is welcome.

Online solid-phase extraction was combined with nano-liquid chromatography coupled to high-resolution mass spectrometry (HRMS) for the analysis of micropollutants in small-volume environmental samples.

The method was validated in surface water, Microcystis aeruginosa cell lysate, and spent Microcystis growth medium. For 41 analytes, quantification limits of 0.… were achieved; in cell lysate, quantification limits ranged from 0.… The method matches the sensitivity of established online and offline solid-phase extraction-liquid chromatography-mass spectrometry methods but requires only a fraction of the sample used by those techniques, and it is among the first applications of nano-LC-MS for environmental analysis.

The method was applied to the determination of bioconcentration in Microcystis aeruginosa in a laboratory experiment, and the benefit of coupling to HRMS was demonstrated in a transformation product screening.

At present, mass spectrometry (MS)-based metabolomics is widely used to obtain new insights into human, plant, and microbial biochemistry; drug and biomarker discovery; nutrition research; and food control.

Despite the high research interest, identifying and characterizing the structures of metabolites remains a major bottleneck in converting raw MS data into biological knowledge. Comprehensive and well-annotated MS-based spectral databases play a key role here by supporting metabolite annotation. This study reviews the main characteristics of the mass spectral databases currently used in MS-based metabolomics, underlining their advantages and limitations.

Finally, future prospects of mass spectral databases are discussed in terms of the needs posed by novel applications and instrumental advancements.

Relative reasoning is supported for the refinement of predictions, and extensions based on previously published but not yet implemented machine learning models are possible.

An RDF database is used to enable simple integration with other databases.

During rain events, biocides and plant protection products are transported to surface waters from agricultural fields, but also from urban sources. Originally designed to be biologically active, these compounds may harm organisms in aquatic ecosystems. Although several models allow either urban or agricultural storm events to be predicted, only a few combine these two sources, and none of them includes biocide losses from building envelopes.

This study therefore aims to develop a model designed to predict water and substance flows from urban and agricultural sources to surface waters. We developed a model based on physical principles for water percolation and substance flow, including micro- (also called matrix-) and macropore flows, for the agricultural areas, together with a model representing sources, sewer systems and a wastewater treatment plant for the urban areas.

In a second step, the combined model was applied to a catchment where an extensive field study had been conducted. The modelled and measured discharge and compound results corresponded reasonably well in terms of quantity and dynamics. The total cumulative discharge was only slightly lower than the total measured discharge (factor 0.…). The modelled urban losses of diuron from facades were within a factor of three of the measured values.

The results highlighted the shift in importance of the flow components during a rain event: from urban sources during the most intensive rain period towards agricultural ones over a prolonged time period. Applications to two other catchments, one neighbouring and one on another continent, showed that the model can be applied using site-specific data for land use, pesticide application and weather, together with literature data for soil-related parameters such as saturated water content, hydraulic conductivity or lateral distances of the drainage pipes, without any further calibration of parameters.

This is a promising basis for using the model in a wide range of catchments.

Marine environments are frequently exposed to oil spills as a result of transportation, oil drilling or fuel usage. Whereas large oil spills and their effects have been widely documented, more common and recurrent small spills typically escape attention. To fill this important gap in the assessment of oil-spill effects, we performed two independent supervised full sea releases of 5 m³ of crude oil, complemented by on-board mesocosm studies and sampling of accidentally encountered slicks.

A selective decline of marine plankton was observed, which is equally relevant for the early stages of larger spills. Our results demonstrate that, contrary to common thinking, even small spills have immediate adverse biological effects, and their recurrent nature is likely to affect marine ecosystem functioning.

Human land uses and population growth represent major global threats to biodiversity and ecosystem services. Understanding how biological communities respond to multiple drivers of human-induced environmental change is fundamental for conserving ecosystems and remediating degraded habitats.

We used different taxonomy- and trait-based community descriptors to establish the most sensitive indicators for detecting impacts and to help elucidate potential causal mechanisms of change. The SPEAR index (a trait-based indicator of sensitivity to pesticides) was more sensitive to the relative input of effluent, suggesting that the toxic influence of wastewater scales with dilution. Whilst freshwater pollution continues to be a major environmental problem, our findings highlight that the same anthropogenic pressure, i.e. …

Thus, remediation strategies aiming to improve stream ecological status, e.g. …

DNA-encoded chemical libraries (DECLs) are collections of organic compounds that are individually linked to different oligonucleotides serving as amplifiable identification barcodes. As all compounds in the library can be identified by their DNA tags, they can be mixed and used in affinity-capture experiments on target proteins of interest.

In this protocol, we describe the screening process that allows the identification of the few binding molecules within the multiplicity of library members.

First, the automated affinity selection process physically isolates binding library members. The resulting selection fingerprints facilitate the discrimination of binding from non-binding library members. Comprehensive Analytical Chemistry, 23 pp.

Iron is present in virtually all terrestrial and aquatic environments, where it participates in redox reactions with surrounding metals, organic compounds, contaminants, and microorganisms.

The model is applied to Macondo reservoir fluid released during the Deepwater Horizon disaster, represented with … pseudocomponents, including … individual compounds.

Additionally, the simulated densities of emitted petroleum fluids affect previous estimates of the volumetric flow rate of dead oil from the emission source.

The OECD guidelines define simulation tests aimed at assessing the biotransformation of chemicals in water-sediment systems. They should serve the estimation of persistence indicators for hazard assessment and of half-lives for exposure modeling.

Although dissipation half-lives of the parent compound are directly extractable from OECD data, they are system-specific and mix up phase transfer with biotransformation. In contrast, aerobic biotransformation half-lives should be easier to extract from OECD experiments with suspended sediments.
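Extracting such a half-life from a concentration time series typically means fitting single first-order kinetics, C(t) = C0·exp(−kt), and converting the rate constant to DT50 = ln 2 / k. A sketch with invented data points:

```python
# Sketch: half-life (DT50) from a concentration time series by fitting
# single first-order kinetics via linear regression on ln(C).
# The time series below is invented for illustration.
from math import log

days = [0, 3, 7, 14, 28]
conc = [100.0, 81.0, 62.0, 38.0, 15.0]   # % of applied amount (hypothetical)

# Least-squares slope of ln(C) versus t gives -k.
n = len(days)
x, y = days, [log(c) for c in conc]
mx, my = sum(x) / n, sum(y) / n
k = -sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

dt50 = log(2) / k
print(f"k = {k:.3f} d^-1, DT50 = {dt50:.1f} d")
```

Regulatory kinetic fitting usually also considers biphasic models and goodness-of-fit criteria; the single first-order fit is only the simplest case.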

Therefore, there is scope for OECD tests with suspended sediment to serve as a proxy for degradation in the aerobic phase of the more complicated OECD test, but that correspondence has remained untested so far. We used Bayesian calibration and uncertainty assessment to calibrate the models for individual experimental types separately and for combinations of experimental types.

However, its uncertainty remained significant when the models were calibrated on individual systems alone.

A model to predict the mass flows and concentrations of pharmaceuticals predominantly used in hospitals across a large number of sewage treatment plant (STP) effluents and river waters was developed at high spatial resolution.

In the modeled base (domestic) scenario, pharmaceutical use was geographically distributed according to the population size served by the respective STPs. Distinct hospital scenarios were set up to evaluate how the predicted results changed when pharmaceutical use in hospitals was allocated differently, for example in proportion to the number of beds or the number of treatments in hospitals.

The hospital scenarios predicted the mass flows and concentrations up to 3.… Field measurements showed that ICM and gadolinium were predicted best by the scenarios using the number of beds or treatments in hospitals with the specific facilities, i.e. …

Pharmaceuticals used both in hospitals and by the general population, e.g. … Our study demonstrated that the bed-number-based hospital scenarios were effective in predicting the geographical distribution of a diverse range of pharmaceuticals in STP effluents and rivers, while the domestic scenario was similarly effective on the scale of large river catchments.
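The scenario logic described above amounts to distributing a national pharmaceutical load across STP catchments using different proportionality keys, then converting mass flow to effluent concentration. A sketch; the national load and all catchment data below are invented:

```python
# Sketch: allocating a national pharmaceutical load to STPs under a
# "domestic" scenario (by population served) versus a "hospital" scenario
# (by hospital beds). All numbers are invented for illustration.

national_load_g_per_day = 1000.0   # national consumption reaching sewers (assumed)

stps = {
    # name: (population_served, hospital_beds, effluent_flow_m3_per_day)
    "STP_1": (50_000, 800, 20_000.0),
    "STP_2": (200_000, 200, 80_000.0),
    "STP_3": (10_000,   0,  4_000.0),
}

def allocate(key_index):
    """Distribute the national load in proportion to one catchment attribute."""
    total = sum(v[key_index] for v in stps.values())
    return {name: national_load_g_per_day * v[key_index] / total
            for name, v in stps.items()}

domestic = allocate(0)   # proportional to population served
hospital = allocate(1)   # proportional to hospital bed numbers

for name, (_, _, flow) in stps.items():
    # effluent concentration: (g/d * 1000 mg/g) / (m3/d) = mg/m3 = ug/L
    c_dom = domestic[name] * 1000 / flow
    c_hos = hospital[name] * 1000 / flow
    print(f"{name}: domestic {c_dom:.2f} ug/L, hospital {c_hos:.2f} ug/L")
```

The two scenarios diverge most for catchments whose bed-to-population ratio is unusual, which is why hospital-centred pharmaceuticals are predicted better by the bed-number key.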

The biotransformation of some micropollutants has previously been observed to be positively associated with ammonia oxidation activities and the transcript abundance of the archaeal ammonia monooxygenase gene amoA in nitrifying activated sludge. Given the increasing interest in and potential importance of ammonia-oxidizing archaea (AOA), we investigated the capability of an AOA pure culture, Nitrososphaera gargensis, to biotransform ten micropollutants belonging to three structurally similar groups, i.e. …

The first hydroxylation step is typically catalyzed by monooxygenases. Tentative structures and possible biotransformation pathways of the detected TPs were proposed. The biotransformation of MIA and RAN occurred only when ammonia oxidation was active, suggesting cometabolic transformations. Consistently, a comparative proteomic analysis revealed no significant differential expression of any protein-encoding gene in N. gargensis.

Taken together, this study provides first important insights into the roles played by AOA in micropollutant biotransformation.

Compound-specific isotope analysis (CSIA) is a promising approach for tracking the biotransformation of organic pollutants, but isotope fractionation associated with aromatic oxygenations is only poorly understood. We investigated the dioxygenation of a series of nitroaromatic compounds to the corresponding catechols by two enzymes, nitrobenzene dioxygenase (NBDO) and 2-nitrotoluene dioxygenase (2NTDO), to elucidate the enzyme and substrate specificity of C and H isotope fractionation.

While the apparent 13C and 2H kinetic isotope effects of nitrobenzene, nitrotoluene isomer, 2,6-dinitrotoluene, and naphthalene dioxygenation by NBDO varied considerably, the correlation of C and H isotope fractionation revealed a common mechanism for nitrobenzene and nitrotoluenes.

Similar observations were made for the dioxygenation of these substrates by 2NTDO. Evaluation of reaction kinetics, isotope effects, and commitment-to-catalysis based on experiment and theory showed that the rates of dioxygenation are determined by the enzymatic O2 activation and aromatic C oxygenation. Because aromatic dioxygenation by nonheme iron dioxygenases is frequently the initial step of biodegradation, O2 activation kinetics may also have been responsible for the minor isotope fractionation reported for the oxygenation of other aromatic contaminants.

Biotransformation is a key process that can greatly influence the bioaccumulation potential and toxicity of organic compounds. In this study, the biotransformation of seven frequently used azole fungicides (triazoles: …) was investigated. Additionally, temporal trends of the whole-body internal concentrations of epoxiconazole, prochloraz, and their respective biotransformation products (BTPs) were studied to gain insight into toxicokinetic processes such as uptake, elimination and biotransformation. By the use of high-resolution tandem mass spectrometry, in total 37 BTPs were identified.

Between one (ketoconazole) and six (epoxiconazole) BTPs were identified per parent compound, except for prochloraz, which showed extensive biotransformation, with 18 BTPs detected that were mainly formed through ring cleavage or ring loss. In general, most BTPs were formed by oxidation and conjugation reactions.

Ring loss or ring cleavage was only observed for the imidazoles, as expected from the general mechanism of oxidative ring opening of imidazoles, and likely affects the bioactivity of these BTPs. Overall, the internal concentrations of BTPs were up to 3 orders of magnitude lower than those of the corresponding parent compounds.

Thus, biotransformation did not dominate toxicokinetics and played only a minor role in the elimination of the respective parent compounds, with the exception of prochloraz.

Studies according to the OECD guidelines are performed to simulate the biodegradation of chemicals in water-sediment systems in support of persistence assessment and exposure modeling.

However, several shortcomings that hamper data evaluation and interpretation have been identified for one of the guidelines, and its relation to the other is still unclear. The present study systematically compares the two OECD guidelines and two variants thereof to derive recommendations on how to experimentally address these shortcomings and improve data for persistence and risk assessment. To this end, four 14C-labeled compounds with different biodegradation and sorption behavior were tested across the standard OECD test systems and two modified versions thereof.

The well-degradable compounds showed slow equilibration and the least mineralization in the standard OECD systems, whereas the modified systems provided the highest degree of mineralization. Different lines of evidence suggest that this was due to increased oxygenation of the sediment in the modified systems. Particularly for rapidly degrading compounds, non-extractable residue formation was in line with degradation and did not follow the sediment-water ratio.

For the two more slowly degrading compounds, sorption in the OECD systems (standard and modified) increased with time beyond the levels predicted by equilibrium partitioning, which could be attributed to grinding of the sediment through the stirring of the sediment suspension. Overall, the large differences in degradation observed across the four test systems suggest that refined specifications in test guidelines are required to reduce variability in test outcomes. At the same time, the amount of sediment and its degree of oxygenation emerged as drivers across all test systems.

This suggests that a unified description of the systems is possible and would pave the way toward a more consistent consideration of degradation in water-sediment systems across different exposure situations and regulatory frameworks.

To study the influence of aqueous solvent on the electronic energy levels of dissolved organic molecules, we conducted liquid-microjet photoelectron spectroscopy (PES) measurements of the aqueous vertical ionization energies (VIEaq) of aniline (7.…).

We also reanalyzed previously reported experimental PES data for phenol, phenolate, thymidine, and the protonated imidazolium cation. The experimental and computational data enable us to decompose the VIEaq into elementary processes. Detailed computational analysis of the flexible molecule veratrole alcohol reveals that the VIE is strongly dependent on molecular conformation in both the gas and aqueous phases. Finally, aqueous reorganization energies of the oxidation half-cell ionization reaction were determined from experimental data or estimated from simulation for the six compounds aniline, phenol, phenolate, veratrole alcohol, dimethylsulfide, and methoxybenzene, revealing a surprising constancy of 2.…

Chloramines, bromamines, and bromochloramines are halogen-containing oxidants that arise from the reaction of hypohalous acids with ammonia in water. Although relevant to both water disinfection chemistry and biochemistry, these molecules are difficult to study in the laboratory, and their thermochemical properties remain poorly established.

We developed a benchmark-level ab initio calculation protocol, termed TA14, adapted from the Weizmann theory and Feller-Peterson-Dixon approaches, to determine the molecular structures and thermochemical properties of these compounds.

We find that the halamine molecules are bound largely, and in some cases entirely, by electron correlation forces. This presumably explains their high reactivity as electrophilic oxidants. Reported thermochemical data enable the determination of equilibrium constants for reactions involving halamines, opening possibilities for more quantitative studies of the chemistry of these poorly understood compounds.

In recent years, a large number of urine-diverting dehydration toilets (UDDTs) have been installed in eThekwini to ensure access to adequate sanitation. The initial purpose of these toilets was to facilitate faeces drying, while the urine was diverted into a soak pit. This practice can lead to environmental pollution, since urine contains high amounts of nutrients. Instead of polluting the environment, these nutrients should be recovered and used as fertiliser.

The international and transdisciplinary research project VUNA was initiated to explore technologies and management methods for better urine management in eThekwini. Three treatment technologies were chosen for the VUNA project. The first is struvite precipitation, a technology which has already been tested in multiple urine treatment projects.

Struvite precipitation is a simple and fast process for phosphorus recovery. Other nutrients, such as nitrogen and potassium, remain in the effluent and pathogens are not completely inactivated. Therefore, struvite precipitation has to be combined with other treatment processes to prevent environmental pollution and hygiene risks.

The second process is a combination of nitrification and distillation. This process combination is more complex than struvite precipitation, but it recovers all nutrients in one concentrated solution, ensures safe sanitisation and produces only distilled water and a small amount of sludge as by-products. The third process is electrolysis.

This process could be used for very small on-site reactors, because conversion rates are high and the operation is simple, as long as appropriate electrodes and voltages are used.

However, nitrogen is removed rather than recovered, and chlorinated by-products are formed, which can be hazardous for human health.

Ozone transforms various organic compounds that absorb light within the UV and visible spectra. UV absorbance can therefore be used to detect the transformation of chemicals during ozonation.

In wastewater, decolourisation can be observed after ozonation. This study investigates the correlation between the UV absorbance difference across the ozonation step (inlet vs. outlet) and the removal efficiency of micropollutants in wastewater. The absorbance at two wavelengths was measured at the ozonation inlet and outlet, as were the concentrations of 24 representative micropollutants and the dissolved organic carbon (DOC).
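The kind of correlation analysed here can be sketched with an ordinary least-squares fit of micropollutant removal against the relative UV absorbance decrease; all numbers below are hypothetical placeholders, not measurements from this study.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for paired observations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical paired measurements across increasing ozone doses:
# relative UV absorbance decrease vs. relative micropollutant removal
delta_uv = [0.10, 0.25, 0.40, 0.55]
removal  = [0.15, 0.35, 0.60, 0.80]

slope, intercept = linear_fit(delta_uv, removal)
# Predict removal from an observed absorbance decrease of 0.30
predicted = slope * 0.30 + intercept
```

Once such a calibration is established for a plant, the measured absorbance decrease can serve as a fast surrogate for micropollutant abatement.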

We therefore suggest that UV absorbance can be used as a feedback control parameter to achieve optimal ozone dosage in wastewater treatment plants and to gain fast insight into the efficiency and stability of the ozonation process.

Incomplete micropollutant elimination in wastewater treatment plants (WWTPs) results in transformation products (TPs) that are released into the environment. Improvements in analytical technologies have allowed researchers to identify several TPs from specific micropollutants, but an overall picture of nontarget TPs is missing.

This procedure was then applied to nontarget data, and 20 potential parent-TP pairs were selected for identification. In summary, primarily a surfactant homologue series with associated TPs was detected. Some obstacles still remain, including spectral interferences from coeluting compounds and the identification of TPs whose structures are less likely to be present in compound databases. The workflow was developed using openly accessible tools and, after parameter adjustment, could be applied to any data set with before-and-after information about various biological or chemical processes.

The assessment of oxidative pollutant biotransformation by compound-specific isotope analysis (CSIA) is often complicated by the variability of kinetic isotope effects associated with carbon oxygenation in enzymatic reactions.

Here, we illustrate how information about the kinetics of oxidative biocatalysis by flavin-dependent monooxygenases (FMOs) enables one to assess whether CSIA could be applied for tracking contaminant biodegradation. In "cautious" FMOs, which form the reactive flavin hydroperoxide species only after substrate binding, the monooxygenation of organic compounds is not rate-determining and consequently does not lead to substrate isotope fractionation.

Conversely, "bold" FMOs generate hydroperoxides regardless of substrate availability, and substrate disappearance is thus subject to isotope fractionation trends, which are typical for hydroxylation reactions.

Because monooxygenations of aromatic moieties are often initial steps of organic pollutant transformation, knowledge of the kinetics of FMOs and other oxidative enzymes can support decisions regarding the use of CSIA.

The occurrence and levels of perfluoroalkyl acids (PFAAs) emitted from specific pollution sources into the aquatic environment in Switzerland were studied using digested sewage sludges from 45 wastewater treatment plants in catchments containing a wide range of potential industrial emitters. Concentrations of individual PFAAs show a high spatial and temporal variability, which implies different contributions from industrial technologies and activities.

Even if wastewater may be expected to be diluted at least 10-fold by the receiving waters, elevated concentrations may be reached at specific locations. Although sewage sludge is a minor compartment for PFAAs in WWTPs, these investigations are helpful for identifying hot-spots from industrial emitters as well as for estimating monthly average concentrations in wastewater.

Environmental quality monitoring of water resources is challenged with providing the basis for safeguarding the environment against adverse biological effects of anthropogenic chemical contamination from diffuse and point sources.

While current regulatory efforts focus on monitoring and assessing a few legacy chemicals, many more anthropogenic chemicals can be detected simultaneously in our aquatic resources. However, exposure to chemical mixtures does not necessarily translate into adverse biological effects nor clearly shows whether mitigation measures are needed.

Thus, the question of which mixtures are present and which have associated combined effects becomes central for defining adequate monitoring and assessment strategies. Here we describe the vision of the international, EU-funded project SOLUTIONS, in which three routes are explored to link the occurrence of chemical mixtures at specific sites to the assessment of adverse biological combination effects. First, multi-residue target and non-target screening techniques covering a broader range of anticipated chemicals co-occurring in the environment are being developed.

By improving sensitivity and detection limits for known bioactive compounds of concern, new analytical chemistry data for multiple components can be obtained and used to characterise priority mixtures.

This information on chemical occurrence will be used to predict mixture toxicity and to derive combined effect estimates suitable for advancing environmental quality standards.

Second, bioanalytical tools will be explored to provide aggregate bioactivity measures integrating all components that produce common adverse outcomes, even for mixtures of varying compositions. The ambition is to provide comprehensive arrays of effect-based tools and trait-based field observations that link multiple chemical exposures to various environmental protection goals more directly and to provide improved in situ observations for impact assessment of mixtures.

Third, effect-directed analysis (EDA) will be applied to identify major drivers of mixture toxicity. Refinements of EDA include the use of statistical approaches with monitoring information to guide experimental EDA studies. These three approaches will be explored in case studies in the Danube and Rhine river basins as well as rivers of the Iberian Peninsula. The synthesis of findings will be organised to provide guidance for future solution-oriented environmental monitoring and to explore more systematic ways to assess mixture exposures and combination effects in future water quality monitoring.

The OECD guideline describes a laboratory test method to assess aerobic and anaerobic transformation of organic chemicals in aquatic sediment systems and is an integral part of tiered testing strategies in different legislative frameworks for the environmental risk assessment of chemicals. The results from experiments carried out according to this guideline are generally used to derive persistence indicators for hazard assessment or half-lives for exposure assessment. We used Bayesian parameter estimation and system representations of various complexities to systematically assess the opportunities and limitations of estimating these indicators from existing guideline data for 23 pesticides and pharmaceuticals.

We found that there is a disparity between the uncertainty and the conceptual robustness of persistence indicators. Disappearance half-lives are directly extractable with limited uncertainty, but they lump degradation and phase transfer information and are not robust against changes in system geometry. Transformation half-lives are less system-specific but require inverse modeling to extract, resulting in considerable uncertainty.

Available data were thus insufficient to derive indicators that had both acceptable robustness and uncertainty, which further supports previously voiced concerns about the usability and efficiency of these costly experiments.

Continued use of the test should, however, be accompanied by mandatory reporting or full standardization of the geometry of the experimental system. We recommend that transformation half-lives determined by inverse modeling be used as input parameters for fate models in exposure assessment, provided due consideration is given to their uncertainty.
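The distinction between disappearance and transformation half-lives can be illustrated with a toy water-sediment model in which the parent compound leaves the water phase both by degradation and by irreversible phase transfer; all rate constants below are invented for illustration.

```python
import math

def simulate_water_column(k_deg, k_transfer, c0=100.0, dt=0.01, t_end=50.0):
    """Explicit-Euler toy model: the parent compound in the water phase is
    lost by degradation (k_deg) and by irreversible transfer to sediment
    (k_transfer); returns the water-phase concentration time series."""
    cw, concs = c0, [c0]
    for _ in range(int(t_end / dt)):
        cw += dt * (-(k_deg + k_transfer) * cw)
        concs.append(cw)
    return concs

def half_life(k):
    """First-order half-life from a rate constant."""
    return math.log(2) / k

k_deg, k_transfer = 0.05, 0.10  # per day, illustrative values

# The transformation half-life reflects degradation only; the disappearance
# half-life lumps degradation and phase transfer and is therefore shorter.
dt50_transformation = half_life(k_deg)
dt50_disappearance = half_life(k_deg + k_transfer)
```

The disappearance half-life is shorter whenever phase transfer contributes to loss, which is precisely why it is not robust against changes in system geometry.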

Biodiversity can differ substantially among wastewater treatment plant (WWTP) communities. Whether differences in biodiversity translate into differences in the provision of particular ecosystem services, however, is under active debate. Theoretical considerations predict that WWTP communities with more biodiversity are more likely to contain strains that have positive effects on the rates of particular ecosystem functions, thus resulting in positive associations between those two variables.

However, if WWTP communities were sufficiently biodiverse to nearly saturate the set of possible positive effects, then positive associations would not occur between biodiversity and the rates of particular ecosystem functions. To test these expectations, we measured the taxonomic biodiversity, functional biodiversity, and rates of 10 different micropollutant biotransformations for 10 full-scale WWTP communities.

We have demonstrated that biodiversity is positively associated with the rates of specific, but not all, micropollutant biotransformations. Thus, one cannot assume whether or how biodiversity will associate with the rate of any particular micropollutant biotransformation. We have further demonstrated that the strongest positive association is between biodiversity and the collective rate of multiple micropollutant biotransformations. Thus, more biodiversity is likely required to maximize the collective rates of multiple micropollutant biotransformations than is required to maximize the rate of any individual micropollutant biotransformation.

We finally provide evidence that the positive associations are stronger for rare micropollutant biotransformations than for common micropollutant biotransformations.
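An association of this kind is typically quantified with a rank correlation. The sketch below computes Spearman's rho between community richness and a collective biotransformation rate for ten hypothetical communities; none of these numbers come from the study.

```python
def rankdata(values):
    """1-based ranks, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical data for 10 communities: taxon richness vs. collective rate
richness = [120, 150, 90, 200, 170, 110, 180, 130, 160, 140]
rate     = [0.8, 1.1, 0.5, 1.6, 1.2, 0.7, 1.5, 0.9, 1.3, 1.0]
rho = spearman(richness, rate)
```

A rank correlation is preferred here because it makes no assumption about the (likely saturating) shape of the biodiversity-function relationship.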

Together, our results are consistent with the hypothesis that differences in biodiversity can indeed translate into differences in the provision of particular ecosystem services by full-scale WWTP communities.

Redox-active minerals are ubiquitous in the environment and are involved in numerous electron transfer reactions that significantly affect biogeochemical processes and cycles as well as pollutant dynamics.

As a consequence, research in different scientific disciplines is devoted to elucidating the redox properties and reactivities of minerals. This review focuses on the characterization of mineral redox properties using electrochemical approaches, from an applied biogeochemical and environmental analytical chemistry perspective.

Establishing redox equilibria between the minerals and working electrodes is a major challenge in electrochemical measurements, which we discuss in an overview of traditional electrochemical techniques.

These issues can be overcome with mediated electrochemical analyses in which dissolved redox mediators are used to increase the rate of electron transfer and to facilitate redox equilibration between working electrodes and minerals in both amperometric and potentiometric measurements.

Using experimental data on an iron-bearing clay mineral, we illustrate how mediated electrochemical analyses can be employed to derive important thermodynamic and kinetic data on electron transfer to and from structural iron.

We summarize anticipated methodological advancements that will further advance the understanding of electron transfer to and from minerals in environmentally relevant redox processes (A review of nonmediated and mediated approaches, Environmental Science and Technology, 49(10)).

Using macroinvertebrates as ecological indicators for different stressors has a long tradition.

However, when applied to field data, one often observes correlations between different macroinvertebrate indices that can be attributed to both correlations of stressors and inherent correlations due to the sensitivity of taxa to different stressors. Ignoring the source of any given correlation leads to ambiguous conclusions about the impact of different stressors. Here, we demonstrate how to distinguish the causes of correlation by means of Monte Carlo simulations.

We assessed to which degree trait-based indices are stressor-specific and whether this depends on the pool of taxa and its taxonomic resolution. To this end, we (1) analysed the frequencies of "sensitive" and "insensitive" taxa for pairwise combinations of different indices, (2) analysed the inherent correlation of indices with random samples from different taxon pools derived from field samples and from a complete species list of a whole ecoregion, and (3) compared this inherent correlation with the actual correlation of the field samples.

We used these new indices to illustrate our approach, while in-depth testing of their applicability was not the focus of our study. The probability that this correlation is only due to inherent correlation in the taxa sensitivities was low. The problem of inherent correlation between indices is more severe for the smaller taxon pool with lower taxonomic resolution. Correlation in the sensitivity of different taxa to different stressors leads to an inherent correlation in trait-based indices, which weakens their explanatory power.
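The Monte Carlo idea can be sketched as follows: communities are drawn at random from a fixed taxon pool with correlated per-taxon sensitivities, and the correlation between two mean-sensitivity indices across these random communities estimates the inherent (stressor-independent) correlation. The pool and sensitivities below are simulated, not field data.

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equally long sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def inherent_index_correlation(sens_a, sens_b, n_taxa=20, n_samples=1000, seed=42):
    """Correlation of two trait-based indices (mean sensitivities) across
    random communities drawn from a fixed taxon pool; no stressor gradient
    is involved, so any correlation is inherent to the taxa sensitivities."""
    rng = random.Random(seed)
    pool = list(range(len(sens_a)))
    idx_a, idx_b = [], []
    for _ in range(n_samples):
        community = rng.sample(pool, n_taxa)
        idx_a.append(sum(sens_a[t] for t in community) / n_taxa)
        idx_b.append(sum(sens_b[t] for t in community) / n_taxa)
    return pearson(idx_a, idx_b)

# Simulated pool of 100 taxa whose sensitivities to two stressors correlate
rng = random.Random(0)
sens_1 = [rng.random() for _ in range(100)]
sens_2 = [0.7 * s + 0.3 * rng.random() for s in sens_1]
r_inherent = inherent_index_correlation(sens_1, sens_2)
```

Comparing this inherent correlation with the correlation observed in field samples indicates how much of the field correlation can be attributed to correlated stressors rather than to the indices themselves.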

Our results highlight the importance of correlation analyses when using trait-based indices to guide ecosystem management, especially in regions with reduced biodiversity.

There is increasing interest in using meta-omics association studies to investigate contaminant biotransformations. The general strategy is to characterize the complete set of genes, transcripts, or enzymes from in situ environmental communities and use the abundances of particular genes, transcripts, or enzymes to establish associations with the communities' potential to biotransform one or more contaminants.

The associations can then be used to generate hypotheses about the underlying biological causes of particular biotransformations. While meta-omics association studies are undoubtedly powerful, they have a tendency to generate large numbers of non-causal associations, making it potentially difficult to identify the genes, transcripts, or enzymes that cause or promote a particular biotransformation. In this perspective, we describe general scenarios that could lead to pervasive non-causal associations or conceal causal associations.

We next explore our own published data for evidence of pervasive non-causal associations. Finally, we evaluate whether causal associations could be identified despite the discussed limitations. Analysis of our own published data suggests that, despite their limitations, meta-omics association studies might still be useful for improving our understanding and predicting the contaminant biotransformation capacities of microbial communities.

(Water Research and Technology, 1(3))

Five rivers with different agricultural and urban influences were monitored from March to July with two methods: (i) two-week time-proportional composite water samples and (ii) two-week passive sampler deployment.

This study showed that SDB passive samplers are well suited for the qualitative screening of polar micropollutants, because the number of detected substances was similar for the SDB samples and the composite water samples. The determination of in-situ calibrated sampling rates (field Rs) was possible for 88 compounds for which the regression of sampler uptake against water concentration was sufficiently linear.

It was observed that ionic species had significantly lower field Rs than neutral species. Due to the complexity of the different transport processes, a correlation between the determined field Rs and log Dow could only predict Rs with large uncertainties. We conclude that only substances with relatively constant river concentrations can be quantified accurately in the field by passive sampling, if substance-specific Rs are determined.
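The in-situ calibration logic can be sketched as follows; uptake is assumed to be integrative over the deployment, and all numbers are illustrative placeholders rather than values from this study.

```python
def in_situ_sampling_rate(mass_sorbed_ng, mean_water_conc_ng_per_l, deploy_days):
    """In-situ calibrated sampling rate Rs (L/day): mass accumulated on the
    sampler divided by the co-determined mean water concentration and the
    deployment time (integrative uptake assumed)."""
    return mass_sorbed_ng / (mean_water_conc_ng_per_l * deploy_days)

def water_conc_from_sampler(mass_sorbed_ng, rs_l_per_day, deploy_days):
    """Time-weighted average water concentration (ng/L) estimated from the
    sampler using a previously calibrated Rs."""
    return mass_sorbed_ng / (rs_l_per_day * deploy_days)

# Illustrative 14-day deployment, calibrated against a composite sample
rs = in_situ_sampling_rate(mass_sorbed_ng=280.0,
                           mean_water_conc_ng_per_l=100.0,
                           deploy_days=14.0)
cw = water_conc_from_sampler(280.0, rs, 14.0)  # recovers the calibration value
```

In a later monitoring campaign under similar conditions, the calibrated Rs would be reused with a newly measured sorbed mass to estimate the time-weighted average concentration.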

For that purpose, the proposed in-situ calibration is a very robust method, and the substance-specific Rs can be used in future monitoring studies in rivers with similar environmental conditions. So far, insecticides and fungicides have played a subordinate role in stream monitoring.

Instead of modeling 3D objects as surface models, we use a volumetric subdivision representation. Volumetric modeling operations allow designing 3D objects in similar ways as with surface-based modeling tools.

Encoding the volumetric information already in the design mesh drastically simplifies and speeds up the mesh generation process for simulation. The transition between design, simulation and back to design is consistent and computationally cheap.

Since the subdivision and mesh generation can be expressed as a precomputable matrix-vector multiplication, iteration times can be greatly reduced compared to common modeling and simulation setups. Therefore, this approach is especially well suited for early-stage modeling or optimization use cases, where many geometric changes are made in a short time and their physical effect on the model has to be evaluated frequently.
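The precompute-then-apply idea can be illustrated in one dimension: a subdivision step expressed as a matrix is built once and then reapplied to every edited control polygon. This is a deliberately simplified midpoint scheme, not the volumetric Catmull-Clark subdivision used in the paper.

```python
def midpoint_subdivision_matrix(n):
    """(2n-1) x n matrix mapping n polyline control points to a refined
    polyline: original points are kept and edge midpoints are inserted."""
    rows = []
    for i in range(2 * n - 1):
        row = [0.0] * n
        if i % 2 == 0:
            row[i // 2] = 1.0          # keep an original control point
        else:
            row[i // 2] = 0.5          # midpoint of the enclosing edge
            row[i // 2 + 1] = 0.5
        rows.append(row)
    return rows

def matvec(matrix, vec):
    return [sum(a * b for a, b in zip(row, vec)) for row in matrix]

# Precompute once; reuse for every subsequent edit of the control polygon
S = midpoint_subdivision_matrix(4)
xs = [0.0, 1.0, 2.0, 3.0]
refined = matvec(S, xs)
```

Because S depends only on the mesh topology, geometric edits reduce to a single matrix-vector product, which is what makes frequent design-simulation iterations cheap.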

To test our approach, we created, simulated and adapted several 3D models. Additionally, we measured and evaluated the timings for generating and applying the matrices for different subdivision levels. For comparison, we also measured the tetrahedral meshing functionality offered by CGAL for similar numbers of elements.

For changing topology, our implicit meshing approach proves to be up to 70 times faster than creating the tetrahedral mesh based only on the outer surface. Without changing the topology, precomputing the matrices yields an even greater speed-up.

We present an approach for integrating interactive design and simulation for customizing parameterized 3D models. Instead of manipulating the mesh directly, a simplified interface for casual users allows for adapting intuitive parameters, such as handle diameter or height of our example object, a cup holder.

The transition between modeling and simulation is performed with a volumetric subdivision representation, allowing direct adaption of the simulation mesh without re-meshing. Our GPU-based FEM solver calculates deformation and stresses for the current parameter configuration within seconds with a pre-defined load case. If the physical constraints are met, our system allows the user to 3D print the object.

Otherwise, it provides guidance on which parameters to change to optimize stability while adding as little material as possible, based on a finite-differences optimization approach. The speed of our GPU solver and the fluent transition between design and simulation render the system interactive, requiring no pre-computation.

Cloud computing rekindles old and imposes new challenges on remote visualization, especially for interactive 3D graphics applications. In this paper we present and discuss an approach entitled 'rich pixels' ('rixels' for short) that balances the requirements concerning security and interactivity with the possibilities of hardware-accelerated post-processing and rendering, both on the server side and on the client side using WebGL.

In this paper, we present a novel volumetric mesh representation suited for parallel computing on modern GPU architectures. The data structure is based on a compact, ternary sparse matrix storage of boundary operators. Boundary operators correspond to the first-order top-down relations of k-faces to their (k-1)-face facets. The compact, ternary matrix storage format is based on compressed sparse row matrices with signed indices and allows for efficient parallel computation of indirect and bottom-up relations.
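A minimal sketch of the storage idea, using a 2D complex for brevity: the face-to-edge boundary operator is stored in CSR form with orientation encoded in the sign of a 1-based index (so no separate value array is needed), and the bottom-up edge-to-face relation is derived from it. The example complex and helper names are hypothetical, not from the paper.

```python
def csr_from_faces(faces_to_edges):
    """CSR storage of a face->edge boundary operator. Orientation is encoded
    in the sign of the 1-based column index, replacing the value array."""
    indptr, indices = [0], []
    for signed_edges in faces_to_edges:
        indices.extend(signed_edges)
        indptr.append(len(indices))
    return indptr, indices

def transpose_relation(indptr, indices, n_edges):
    """Bottom-up relation (edge -> incident faces) derived from the CSR
    operator, analogous to a sparse transpose."""
    edge_to_faces = [[] for _ in range(n_edges)]
    for face in range(len(indptr) - 1):
        for signed in indices[indptr[face]:indptr[face + 1]]:
            edge_to_faces[abs(signed) - 1].append(face)
    return edge_to_faces

# Two triangles sharing edge 3 (1-based), traversed with opposite orientation
faces = [[+1, +2, +3], [-3, +4, +5]]
indptr, indices = csr_from_faces(faces)
incidence = transpose_relation(indptr, indices, n_edges=5)
```

Using 1-based signed indices avoids the ambiguity of a signed zero, which is presumably why the signed-index trick pairs naturally with CSR offsets.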

This representation is then used in the implementation of several parallel volumetric mesh algorithms, including Laplacian smoothing and volumetric Catmull-Clark subdivision.

In this paper, we present a novel approach for a tighter integration of 3D modeling and physically-based simulation.

Volumetric modeling operations allow designing 3D objects in similar ways as with surface-based modeling tools, while automatic checks and modifications of inner control points ensure consistency during the design process. We measured and evaluated the timings for generating and applying the matrices for different subdivision levels.

Additionally, we computed several characteristic factors for mesh quality and mesh consistency. For comparison, we analyzed the tetrahedral meshing functionality offered by CGAL for similar numbers of elements. Without changing the topology and by precomputing the matrices, we achieve a substantial speed-up, as all the required information is already available.

Reviewer]; Mueller-Roemer, Johannes [2. Reviewer]. Shortly thereafter, the publication by Ataer-Cansizoglu et al. followed. A drawback of these methods is the high processing time of a single registration step, which prevents them from performing interactive reconstructions. In contrast to existing methods, this thesis implements a local registration algorithm. Planar features are used preferentially, since their number in a scene is significantly lower than that of points.

This enables a faster correspondence search and registration. The algorithm is thus able to perform registration even in low-texture regions with few geometric features, where point-only techniques fail. Furthermore, the local registration approach enables interactive use, giving the user real-time feedback on the registration process.

Additionally implemented extensions, which exploit the detected planar information for geometry correction, support the registration process. Moreover, in contrast to comparable hybrid systems, the system achieves a six-fold higher reconstruction rate.

Engineering in the context of Industrie 4.0: networked smart products and services open up fundamentally new potential for innovation and new business fields. In intensive discussions with representatives from industry, some of these questions and innovation potentials were addressed and evaluated.

Reviewer]; Sevilmis, Neyir [2. Reviewer]. This bachelor thesis proposes a novel approach for recognizing transport boxes on pallets at goods receipt in logistics. Especially when pallets are not labeled with barcodes or RFID tags, it is very time-consuming to manually recognize and count the incoming boxes. Therefore, a measuring station in which the pallets are recognized is presented.

Two images of opposite sides of the pallet are taken and further processed by modular software to recognize and count the transport boxes. The process is divided into three main steps. In the first step, the input images are segmented to obtain one image snippet for each transport box present in the image. This is done by morphologically transforming the input images and extracting the boundaries of the boxes.

In the second step, each of the previously calculated image segments is classified by a multi-class support vector machine, trained with histograms of visual words obtained from a bag-of-words model, in combination with colour information. Finally, the classification results of the images taken from the two opposite perspectives of the pallet are merged to obtain an overall statement of how many and which types of transport boxes are stacked on the pallet.

We present a novel multigrid scheme based on a cut-cell formulation on regular staggered grids which generates compatible systems of linear equations on all levels of the multigrid hierarchy. This geometrically motivated formulation is derived from a finite volume approach and exhibits an improved rate of convergence compared to previous methods. Existing fluid solvers with voxelized domains can directly benefit from this approach by only modifying the representation of the non-fluid domain.

The necessary building blocks are fully parallelizable and can therefore benefit from multi- and many-core architectures. Gutachter]; Altenhofen, Christian [2. In this thesis a subdivision-based method is presented for calculating numerical solutions to differential equations on the basis of a geometric representation that is also well suited for modeling.

The conversion between these two representations can become a hugely time-consuming process. Utilizing the same representation for modeling and simulating objects speeds up the whole engineering process, as the need for mesh generation is eliminated. This also reduces the error made by approximating the geometry. As subdivision schemes are intuitive and efficient to use for modeling and visualizing complex geometries, they serve well as a basis for this method.

The presented method is based on Chaikin's algorithm for one-dimensional objects and utilizes Catmull-Clark surfaces to represent two-dimensional objects. On the basis of these two subdivision schemes, solutions to the heat equation are generated, demonstrating the applicability of the approach.
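Chaikin's corner-cutting scheme, which the method builds on for one-dimensional objects, can be sketched as:

```python
def chaikin(points, iterations=1, closed=False):
    """Chaikin's corner-cutting: each edge (p, q) contributes the points at
    1/4 and 3/4 along the edge; repeated application converges to a
    quadratic B-spline curve."""
    for _ in range(iterations):
        refined = []
        edges = list(zip(points, points[1:] + (points[:1] if closed else [])))
        for (px, py), (qx, qy) in edges:
            refined.append((0.75 * px + 0.25 * qx, 0.75 * py + 0.25 * qy))
            refined.append((0.25 * px + 0.75 * qx, 0.25 * py + 0.75 * qy))
        points = refined
    return points

poly = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
smoothed = chaikin(poly, iterations=1)
```

Because each refinement is an affine combination of the control points, it can also be written as a fixed matrix applied to the control polygon, which is what makes the scheme attractive as a simulation basis.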

The exactness of this solution and the performance of the algorithm are compared to a traditional FEM approach to the heat equation. Gutachter]; Borner, Matthias [2. Um aus bereits bestehenden Gebäuden ein BIM zu gewinnen, wird in der Forschung nach Verfahren gesucht, die dies möglichst einfach machen. Momentan wird dabei meistens in zwei Abschnitten gearbeitet. Falls dabei festgestellt wird, dass die Aufnahme für die Analyse nicht gut genug ist, besteht meist nicht oder nur unter erheblichem Mehraufwand die Möglichkeit, das Gebäude erneut aufzunehmen.

Therefore, a way is needed to determine, while the recording is still in progress, whether a model can be created.

For this, the methods used must be robust and fast. In this work, a method called object analysis is presented that finds windows and open doors in point clouds. This method is embedded in an application called ResourceApp, with which rooms can be recorded and walls can be detected in the resulting point clouds. Since the available point clouds can be noisy and inaccurate, methods from other works are not applicable.

The approach for the object analysis, however, is provided by a principle found in many works: the points of the point cloud are projected from 3D into 2D space, where they can be examined with proven methods from image processing.

For the projection, the object analysis uses the wall surfaces of the room. One of the challenges of the object analysis is to make as few errors as possible. There are two kinds of errors, called false positives and false negatives. Reducing one kind usually increases the other, so a good trade-off must be found.

The object analysis works in several steps. In the first step, the points of the point cloud are projected onto the wall planes, creating an image for each wall. The object analysis is controlled by five parameters and can thus be optimally adapted to each room. Objects can be edited, deleted, and created; besides doors and windows, further objects are available here to extend the model of the room. The object analysis can then be performed on these edited walls.

The object analysis was tested on six rooms and on two computers. Its execution time was always below half a second, and on average even considerably lower. The object analysis also delivers good results in terms of accuracy: of the ten well-recorded objects, 9 were detected, and even one poorly recorded door could be recognized.

Together with the possibility of manual editing, the object analysis is therefore a good solution to the problem at hand.

We present a novel p-multigrid method for efficient simulation of corotational elasticity with higher-order finite elements.

In contrast to other multigrid methods proposed for volumetric deformation, the resolution hierarchy is realized by varying polynomial degrees on a tetrahedral mesh. The multigrid approach can be either used as a direct method or as a preconditioner for a conjugate gradient algorithm.

We demonstrate the efficiency of our approach and compare it to commonly used direct sparse solvers and preconditioned conjugate gradient methods. We introduce the use of cubic finite elements for volumetric deformation and investigate different combinations of polynomial degrees for the hierarchy.

We analyze the applicability of cubic finite elements for deformation simulation by comparing analytical results in a static and dynamic scenario and demonstrate our algorithm in dynamic simulations with quadratic and cubic elements. Applying our method to quadratic and cubic finite elements results in a speed-up of up to a factor of 7 for solving the linear system.

Reviewer]; Weber, Daniel [2. Reviewer]. This thesis presents a time integration scheme with which the equation of motion of a mass-spring system can be solved via an optimization problem. The linear system of equations resulting from the optimization problem has a constant system matrix that only needs to be computed once. Furthermore, this special data structure allows a nearly full utilization of the graphics card's memory bandwidth.
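The benefit of a constant system matrix can be sketched with a toy 2x2 system: the closed-form inverse is computed once, and every time step only assembles a new right-hand side. This is a strong simplification of the thesis setup, where a large sparse matrix would be prefactorized (e.g. by a Cholesky decomposition) rather than explicitly inverted.

```python
def invert_2x2(a):
    """Closed-form inverse of a 2x2 matrix, computed once for the constant
    system matrix (stand-in for a sparse prefactorization)."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [[ a[1][1] / det, -a[0][1] / det],
            [-a[1][0] / det,  a[0][0] / det]]

def solve(a_inv, b):
    """Reuse the precomputed inverse to solve A x = b for a new b."""
    return [a_inv[0][0] * b[0] + a_inv[0][1] * b[1],
            a_inv[1][0] * b[0] + a_inv[1][1] * b[1]]

# Toy constant system matrix (think M/h^2 plus stiffness terms)
A = [[4.0, 1.0], [1.0, 3.0]]
A_inv = invert_2x2(A)

# Each time step, only the right-hand side changes; the solve reuses A_inv
x1 = solve(A_inv, [1.0, 2.0])
x2 = solve(A_inv, [0.5, -1.0])
```

Amortizing the factorization over many time steps is what turns the per-step cost into a cheap, bandwidth-bound operation suitable for GPUs.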

Scott; Wong, Pak Chung. The answers compiled here are not meant to be all-encompassing or deterministic when it comes to the opportunities computer graphics and interactive visualization hold for the future.

This thesis covers interactive physically-based simulation for applications such as computer games or virtual environments. A simple way to guarantee the required low computation time is to drastically limit the resolution. However, with current methods the number of degrees of freedom will then be rather low, which results in a low degree of realism, because not every detail that is important for realistically representing the physical system can be resolved.

This thesis contributes to interactive physically-based simulation by developing novel methods and data structures, which can be associated with the three pillars of this thesis. The novel approaches are evaluated in two application areas relevant to computer-generated animation. The resulting accelerations allow for a higher degree of realism, because the number of elements or the resolution can be significantly increased.

This master's thesis deals with interactive physically-based simulation on distributed systems. It investigates how existing algorithms and data structures for interactive and non-interactive simulations must be adapted in order to be used on distributed systems. A distributed system consists of multiple graphics cards, which can additionally be distributed across compute nodes in a network.

The algorithms investigated are the conjugate gradient method and the multigrid method. Reviewer]; Kuijper, Arjan [2. Reviewer].

Here, a user interactively navigates through a visualization of the dataset in order to locate relevant results. This is an application-specific process, for example to find unexpected data or data that confirms or refutes a hypothesis. Limiting factors in this application scenario are the local main and graphics memory, the local compute power, and the network bandwidth available for data transfer.

This thesis presents an approach based on the prior work of Ge et al. Controlled by the client, it performs view-dependent, point-based sampling of a remote surface-based simulation dataset. The point cloud of simulation data gained incrementally in this way is organized and temporarily cached in an octree that is synchronized between client and server. On the client, the surface of the simulation dataset is then reconstructed using a splat-based reconstruction method.
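The octree bookkeeping behind such incremental caching can be illustrated with a minimal point octree; this is a simplified sketch with a hypothetical node capacity, and the client/server synchronization is omitted:

```python
import random

class Octree:
    """Minimal point octree: a node splits into 8 children when full."""
    def __init__(self, center, half, capacity=4):
        self.center, self.half, self.capacity = center, half, capacity
        self.points, self.children = [], None

    def _child_index(self, p):
        cx, cy, cz = self.center
        # bit 0: x side, bit 1: y side, bit 2: z side
        return (p[0] > cx) | ((p[1] > cy) << 1) | ((p[2] > cz) << 2)

    def insert(self, p):
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append(p)
                return
            self._split()
        self.children[self._child_index(p)].insert(p)

    def _split(self):
        h = self.half / 2.0
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + (dx * 2 - 1) * h,
                    cy + (dy * 2 - 1) * h,
                    cz + (dz * 2 - 1) * h), h, self.capacity)
            for dz in (0, 1) for dy in (0, 1) for dx in (0, 1)
        ]
        pts, self.points = self.points, []
        for q in pts:
            self.children[self._child_index(q)].insert(q)

    def count(self):
        if self.children is None:
            return len(self.points)
        return sum(c.count() for c in self.children)

# Usage: insert a batch of random sample points
random.seed(1)
tree = Octree((0.0, 0.0, 0.0), 1.0)
points = [(random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
          for _ in range(200)]
for p in points:
    tree.insert(p)
```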

Finally, the practicality of the method is evaluated experimentally on a small example dataset. The direct dependence of the interactively visualizable data volume on the server-side resources can, however, based on the prior work of Ge et al.

Creating an intuitive way of modeling and designing different structures is a task of high relevance with a wide range of applications in computer graphics.

Various options will have to be considered with respect to new developments. For instance, the following options are addressed in this thesis: The availability of modern and low-cost 3D hardware provides users with new possibilities. Visualization and interaction can be performed in 3D, and new approaches to applying the interactions are required. In this regard, a combination of the zSpace tablet and the Leap Motion is shown to fulfill these requirements.

Interactions with the mesh can be implemented with the zSpace tablet, while the Leap Motion is used to control navigation in the scene and menus. Overall, some relevant and new 3D interactions are developed which, among other things, allow for the manipulation of inner structures. These 3D interactions are meaningful, useful and intuitive; the thesis thereby contributes to current developments. Rücker, Marko; Altenhofen, Christian [1.

In the early design phase of a real object, e.g. that meets flow and stability properties. Traditionally, there are two approaches: This thesis deals with several aspects of combining the two approaches. The aim is to combine the advantages of modeling on a real object with those of quickly assessing the properties of a virtual object through simulation.

An approach is shown in which the triangulation of a point cloud of approx. Furthermore, this approach offers a qualitatively better triangulation at a constant or

Over the past few years, there has been a clear trend toward more individualized and customized products that require increasing flexibility in production. Globally, this effort would also reinforce economies traditionally strong in engineering and manufacturing. The present paper describes the development of a modular and easily configurable simulation platform for ground vehicles.

This platform should be usable for implementing driving simulators employed both for training purposes and for vehicle component testing. In particular, the paper presents a first architectural model for a simulation platform based on the Functional Digital Mock-Up (FDMU) approach. This platform will allow engineers to implement different kinds of simulators that integrate both physical and virtual components, making it possible to quickly reconfigure the architecture depending on the hardware and software used and on specific test-case needs.

We present a novel p-multigrid method for the efficient simulation of co-rotational elasticity with higher-order finite elements. We analyze the applicability of cubic finite elements for deformation simulation by comparing against analytical results in a static scenario, and we demonstrate our algorithm in dynamic simulations with quadratic and cubic elements.
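The coarse-grid correction at the heart of any multigrid method can be sketched in its simplest two-grid form; this is an illustrative h-multigrid cycle on a 1D Poisson model problem, an analogue of (but not identical to) the paper's p-multigrid for elasticity:

```python
import numpy as np

def jacobi(A, x, b, n=3, w=2.0 / 3.0):
    """A few sweeps of weighted Jacobi smoothing."""
    Dinv = 1.0 / np.diag(A)
    for _ in range(n):
        x = x + w * Dinv * (b - A @ x)
    return x

def two_grid(A, b, x, R, P):
    """One cycle: pre-smooth, coarse-grid correction, post-smooth."""
    x = jacobi(A, x, b)
    r = b - A @ x
    Ac = R @ A @ P                       # Galerkin coarse operator
    x = x + P @ np.linalg.solve(Ac, R @ r)  # exact coarse solve
    return jacobi(A, x, b)

# 1D Poisson on 15 interior points; linear interpolation to/from 7 coarse points
n = 15
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
nc = (n - 1) // 2
P = np.zeros((n, nc))
for j in range(nc):
    P[2 * j, j] += 0.5
    P[2 * j + 1, j] += 1.0
    P[2 * j + 2, j] += 0.5
R = 0.5 * P.T                            # full-weighting restriction
x = np.zeros(n)
for _ in range(20):
    x = two_grid(A, b, x, R, P)
```

In a p-multigrid setting the "coarse" space is a lower polynomial order rather than a coarser mesh, but the smooth/restrict/correct/prolong structure is the same.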

Applying our method to quadratic and cubic finite elements results in a speed-up of up to a factor of 7 for solving the linear system.

A large number of photographs of cultural heritage items and monuments is publicly available in various Open Access Image Repositories (OAIR) and social media sites. Metadata inserted by the camera, the user and the host site may help to determine the photograph's content, geo-location and date of capture, thus allowing us, with relative success, to localise photos in space and time.

Additionally, developments in Photogrammetry and Computer Vision, such as Structure from Motion (SfM), provide a simple and cost-effective method of generating relatively accurate camera orientations and sparse and dense 3D point clouds from 2D images.

Our main goal is to provide a software tool able to run on desktop or cluster computers or as a back end of a cloud-based service, enabling historians, architects, archaeologists and the general public to search, download and reconstruct 3D point clouds of historical monuments from hundreds of images from the web in a cost-effective manner.

The end products can be further enriched with metadata and published. This paper describes a workflow for searching and retrieving photographs of historical monuments from OAIR, such as Flickr and Picasa, and using them to build dense point clouds with SfM and dense image-matching techniques. Computational efficiency is improved by a technique that reduces image-matching time by using an image-connectivity prior derived from low-resolution versions of the original images.
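The connectivity prior can be illustrated abstractly: match cheap low-resolution descriptors first, and run expensive full-resolution matching only on the pairs that pass. In this toy sketch the thumbnail features are hypothetical sets of visual-word ids rather than real image descriptors:

```python
from itertools import combinations

def connectivity_prior(thumb_features, min_shared=3):
    """Cheap pass over low-res 'thumbnail' features: keep only image
    pairs that already share a few features; full-resolution matching
    then runs on this reduced pair list instead of all pairs."""
    keep = []
    for (i, fi), (j, fj) in combinations(enumerate(thumb_features), 2):
        if len(fi & fj) >= min_shared:
            keep.append((i, j))
    return keep

# Toy example: four images, two overlapping views of each of two scenes
thumbs = [
    {1, 2, 3, 4, 5},     # image 0
    {3, 4, 5, 6, 7},     # image 1: overlaps image 0
    {20, 21, 22, 23},    # image 2: a different view
    {21, 22, 23, 24},    # image 3: overlaps image 2
]
pairs = connectivity_prior(thumbs)
```

Of the six possible pairs, only the two genuinely overlapping ones survive, so the quadratic matching stage touches far fewer pairs.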

Benchmarks for two large datasets showing the respective efficiency gains are presented.

In the future, production facilities will coordinate with one another and control themselves; products will know where they are and what they still need to become a finished article.

This paper presents a new technique for automatic, interactive 3D subdomaining, coupled with mesh and simulation refinements, in order to enhance local resolutions of CAE domains.

Numerical simulations have become crucial during the product development process (PDP) for predicting different properties of new products, as well as for simulating various kinds of natural phenomena. Most of the time, engineers are interested in a deeper understanding of local quantities rather than being exposed to an iterative re-simulation of the overall domain.

New techniques for automatic and interactive processes are then challenged by the cardinality and structural complexity of the CAE domain. This paper introduces a new interactive technique that automatically reduces the analysis space and allows engineers to enhance the resolution of local problems without needing to recalculate the global problem. The technique, integrated into a VR-based front end, achieves faster reanalysis cycles compared with traditional COTS tool chains and engineering workflows.

Europe is rich in cultural heritage, but unfortunately many of the tens of millions of artifacts remain in archives. Many of these resources have been collected to preserve our history and to understand their historical context. Nevertheless, CH institutions are able neither to document all the collected resources nor to exhibit them.

Additionally, many of these CH resources are unique and will be on public display only occasionally. Hence, access to and engagement with this kind of cultural resource is important for European culture and the legacy of future generations. However, the technology needed to economically mass-digitize and annotate 3D artifacts, in analogy to the digitization and annotation of books and paintings, has yet to be developed.

Likewise, approaches to semantic enrichment and storage of 3D models along with metadata are just emerging. This paper presents challenges and trends to overcome the latter issues and demonstrates the latest developments for annotating 3D artifacts and their subsequent export to Europeana, the European digital library, for integrated, interactive 3D visualization within regular web browsers, taking advantage of technologies such as WebGL and X3D.

One of the main characteristics of the Internet era we are living in is the free, online availability of a huge amount of data. This data is of varied reliability and accuracy and exists in various forms and formats.

Often, it is cross-referenced and linked to other data, forming a nexus of text, images, animation and audio enabled by hypertext and, recently, by the Web 3.0. Search engines can search text for keywords using algorithms of varied intelligence and with limited success. Searching images is a much more complex and computationally intensive task, but some initial steps have already been made in this direction, mainly in face recognition.

Our main goal is to enable historians, architects, archaeologists, urban planners and affiliated professionals to reconstruct views of historical monuments from thousands of images floating around the web.

We present graphics processing unit (GPU) data structures and algorithms to efficiently solve sparse linear systems, as typically required in simulations of multi-body systems and deformable bodies. We introduce an efficient sparse matrix data structure that can handle arbitrary sparsity patterns and outperforms current state-of-the-art implementations of sparse matrix-vector multiplication. Moreover, an efficient method to construct global matrices on the GPU is presented, in which hundreds of thousands of individual element contributions are assembled in a few milliseconds.

A finite-element-based method for simulating deformable solids, as well as an impulse-based method for rigid bodies, is introduced to demonstrate the advantages of the novel data structures and algorithms. These applications share the characteristic that a major part of the computational effort consists of building and solving systems of linear equations in every time step.
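The kernel such solvers rely on is the sparse matrix-vector multiply. Sketched sequentially over a compressed sparse row (CSR) layout, a common baseline format (the paper's GPU data structure is its own design), it looks like this:

```python
import numpy as np

def csr_spmv(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix A in compressed sparse row format:
    values holds the nonzeros row by row, col_idx their columns,
    and row_ptr[i]:row_ptr[i+1] delimits row i's entries."""
    n = len(row_ptr) - 1
    y = np.zeros(n)
    for i in range(n):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# The 3x3 matrix [[4,0,1],[0,3,0],[1,0,2]] stored row by row
values  = np.array([4., 1., 3., 1., 2.])
col_idx = np.array([0, 2, 1, 0, 2])
row_ptr = np.array([0, 2, 3, 5])
y = csr_spmv(values, col_idx, row_ptr, np.array([1., 1., 1.]))
```

On a GPU the outer loop is parallelized over rows (or groups of rows), which is where layout choices start to matter for performance.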

Our solving method results in a speed-up factor of up to 13 in comparison to other GPU methods.

The phases of the embodiment stage are conceived sequentially and, in some domains, even cyclically. Nevertheless, there is no seamless integration between them, causing longer development processes, increased time lags, loss of inertia, greater misunderstandings, and conflicts.

Embodiment Discrete Processing enables the seamless integration of three building blocks. These integrated building blocks support the multidisciplinary work between designers and analysts, which was previously unusual.

It creates a new understanding of what integral processing is, whose phases were previously regarded as independent. Finally, it opens up new opportunities toward general-purpose processing.

copper exists in the building stock.

Despite many advances in mesh compression methods over the past two decades, there is still no consensus on a standardized compact mesh-encoding format for 3D Web applications. In order to facilitate the design of a future platform-independent solution, this paper investigates the crucial trade-off between the compactness of the compressed representation and decompression time.

Our case study evaluates different encoding formats, combined with various transmission bandwidths, on different client devices. The results indicate that good compression rates, together with fast decompression, can be achieved by exploiting existing browser features and by minimizing the complexity of the operations that have to be performed inside the JavaScript layer. Our findings are summarized in concrete recommendations for future standards.
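One classic point in this compactness/decoding-speed trade-off is coordinate quantization: store each vertex coordinate as a 16-bit integer inside the mesh's bounding box, so that decoding is a single multiply-add per coordinate. This is a generic sketch of the idea, not one of the formats evaluated in the paper:

```python
import numpy as np

def quantize(vertices, bits=16):
    """Encode float positions as unsigned ints over the bounding box."""
    vmin = vertices.min(axis=0)
    vmax = vertices.max(axis=0)
    scale = (vmax - vmin) / (2 ** bits - 1)
    scale[scale == 0] = 1.0                  # guard flat dimensions
    q = np.round((vertices - vmin) / scale).astype(np.uint16)
    return q, vmin, scale

def dequantize(q, vmin, scale):
    """Decoding is one fused multiply-add per coordinate."""
    return q.astype(np.float64) * scale + vmin

verts = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 0.5], [0.3, 0.7, 0.9]])
q, vmin, scale = quantize(verts)
decoded = dequantize(q, vmin, scale)
```

The payload shrinks from 4 bytes to 2 bytes per coordinate, the maximum error is half a quantization step, and the trivial decode loop is exactly the kind of low-complexity operation that stays cheap even in a JavaScript layer.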

Virtual, Augmented and Mixed Reality. Designing and Developing Augmented and Virtual Environments. This paper presents a new approach for the design and realization of a Virtual Reality (VR) based engineering front end that enables engineers to combine post-processing tasks and finite element methods for linear static analyses at interactive rates.

Here, designers and engineers interact with the virtual mock-up, change boundary conditions (BCs), vary the geometry or BCs, and simulate and analyze the impact on the CAE mock-up. The potential of VR for post-processing engineering data inspired ideas to deploy it for interactive investigations at the conceptual stage.

While this is a valid hypothesis, many challenges and problems remain due to the nature of the "change'n play" paradigm imposed by conceptual simulations, as well as the non-availability of accurate, interactive FEM procedures.

Intelligent methods, processes and technologies. The concepts of Industrie 4.0. According to our comprehensive interpretation of the term "Industrie 4.0".

In the restoration planning process, a curator evaluates the condition of a Cultural Heritage (CH) object and accordingly develops a set of hypotheses for improving it. This iterative process is complex, time-consuming, and requires many manual interventions.

In this context, we propose interactive modeling techniques based on subdivision surfaces, which can support the completion of CH objects for restoration planning. The proposed technique starts with a scanned and incomplete object, represented by a triangle mesh, from which a subdivision surface can be generated.

Based on this mixed representation, sketching techniques and modeling operations can be combined to extend and refine the subdivision surface according to the curator's hypothesis. Thus, curators without rigorous modeling experience can directly create and manipulate surfaces in a similar way as they would on a piece of paper.
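The subdivision principle behind such surfaces can be illustrated in one dimension by Chaikin's corner-cutting scheme for curves; this is an analogy for illustration only, since the technique itself operates on subdivision surfaces over triangle meshes:

```python
def chaikin(points, iterations=3):
    """Chaikin corner cutting: each refinement replaces every edge
    (p, q) by the two points 3/4 p + 1/4 q and 1/4 p + 3/4 q; the
    polyline converges to a smooth quadratic B-spline curve."""
    for _ in range(iterations):
        refined = []
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        points = refined
    return points

# A coarse control polyline refined four times
curve = chaikin([(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)], iterations=4)
```

The appeal for non-expert users is exactly this behavior: a few coarse control elements are enough, and refinement produces a smooth shape automatically.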

We present the capabilities of the proposed technique on two interesting CH objects.

Virtual assembly training systems show high potential to complement or even replace physical setups for training assembly processes in and beyond the automotive industry.

The precondition for the breakthrough of virtual training is that it overcomes the problems of former approaches. One key challenge to address when developing virtual assembly training is the extensive authoring effort for setting up virtual environments.

Although knowledge from product and manufacturing design is available and could be used for virtual training, a concept for integrating this data is still missing. This paper presents the design of a platform that transfers available enterprise data into a unified model for virtual training and thus enables virtual training of workers at the assembly line before physical prototypes exist.

The data requirements and constraints stemming from industrial partners involved in the project will be discussed. A second hurdle for virtual training is the insufficient user integration and acceptance.

In this context, the paper introduces an innovative hardware set-up for game-based user interaction, which has been chosen to enhance user involvement and acceptance of virtual training.

In recent decades, many approaches for implementing computational fluid dynamics (CFD) have been developed in the computer graphics community.

One of the newer approaches to fluid simulation is the discrete exterior calculus (DEC). DEC uses well-centered meshes to describe its integration space.

Mullen's discrete formulation of the Navier-Stokes equations provides full control over viscosity and, moreover, an almost perfect preservation of kinetic energy.

We translate Mullen's discretization into two dimensions and extend it to regular grids. We discuss how to manage non-trivial boundary conditions. Finally, we analyze the results of Mullen's approach and investigate alternative methods to further improve those results.
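The structural backbone of DEC is the discrete exterior derivative, for which d∘d = 0 holds by construction. This can be checked on a tiny two-triangle mesh; it is an illustration of the operators only, since a full scheme like Mullen's additionally needs Hodge star operators:

```python
import numpy as np

# A square split into two triangles: 4 vertices, 5 oriented edges, 2 faces.
# Edge k runs tail -> head; each face is a loop of (edge index, orientation).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
faces = [[(0, +1), (1, +1), (4, -1)],   # triangle 0-1-2
         [(4, +1), (2, +1), (3, +1)]]   # triangle 0-2-3

d0 = np.zeros((len(edges), 4))           # 0-forms (vertices) -> 1-forms (edges)
for k, (tail, head) in enumerate(edges):
    d0[k, tail], d0[k, head] = -1.0, 1.0

d1 = np.zeros((len(faces), len(edges)))  # 1-forms (edges) -> 2-forms (faces)
for f, loop in enumerate(faces):
    for k, sign in loop:
        d1[f, k] = sign

dd = d1 @ d0    # must vanish: the boundary of a boundary is empty
```

Because the derivative operators are purely combinatorial incidence matrices, conservation properties such as d∘d = 0 hold exactly in floating point, which is one reason DEC schemes preserve invariants like kinetic energy so well.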

We present a novel high-resolution 3D geometry acquisition technique based on structured-light reconstruction with a low-cost projector-camera system. Using a 1D mechanical lens-shifter extension in the projector's light path, the projected pattern is shifted in subpixel-scale steps with a granularity of up to steps per projected pixel, which opens up novel possibilities in depth accuracy and smoothness for the acquired geometry.

Reaching beyond depth resolutions achieved by conventional structured light scanning approaches with projector-camera systems, depth layering effects inherent to conventional techniques are fully avoided.

We see two main benefits. First, our acquisition setup can reconstruct the finest details of small cultural heritage objects, such as antique coins, and thus digitally preserve them at appropriate precision. Second, our accurate height fields are a viable input to physically based rendering in combination with measured material BRDFs, to reproduce compelling, spatially varying, material-specific effects.

Virtual surrogates of Cultural Heritage (CH) objects are seriously being considered in professional activities such as conservation and preservation, exhibition planning, packing, and scholarly research, among many other activities.

Although this is a very positive development, a bare 3D digital representation is insufficient for fulfilling the full range of professional activities. The tool eases user interaction, allowing inexperienced users without previous knowledge of semantic models or 3D modeling to employ it and to adopt it within the professional workflow on 3D annotations.

We illustrate the capabilities of our tool in the context of the Saalburg fort, built during Roman times (2nd century AD) for the protection of the Limes on the Taunus hills in Germany.

An important part of the product development cycle is the optimization of a component's fluid-mechanical or structural-mechanical properties, which normally takes place in an iterative and very laborious process.

In early conceptual design phases, different material parameters as well as different geometries must be tried out and compared in order to arrive at an optimal design for the later product. This time-consuming process clearly limits the number of options that can be analyzed. This thesis presents the "Rapid CFD" framework, which makes it possible to use fast flow simulations in the early conceptual design phase. To achieve such speed, the computation and visualization of two-dimensional flows are combined in real time.

This enables the interactive modification of parameters and boundary conditions, and thus a fast analysis and evaluation of different geometries and an early optimization of a component. The computations are executed on a standard desktop PC, so that the simulation results remain in graphics memory and can be used directly for visualization.

B-splines are used for modeling the geometry, so that users can locally modify the shape via individual control points. The discretization is likewise executed on the GPU. The computation of a single time step, even for millions of unknowns, is performed in fractions of a second.
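As a flavor of such per-frame updates, one explicit time step of 2D diffusion on a grid can be sketched with NumPy; this is a generic building block of simple real-time flow solvers, not the Rapid CFD implementation, and the parameters are hypothetical:

```python
import numpy as np

def diffuse_step(u, nu=0.1, dt=0.1, h=1.0):
    """One explicit time step of 2D diffusion; boundary values are
    held fixed by restoring them after the interior update."""
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / h ** 2
    v = u + dt * nu * lap
    v[0, :], v[-1, :], v[:, 0], v[:, -1] = u[0, :], u[-1, :], u[:, 0], u[:, -1]
    return v

u = np.zeros((32, 32))
u[16, 16] = 1.0            # a concentrated blob of some quantity
for _ in range(100):
    u = diffuse_step(u)    # the blob spreads out over time
```

Each step is a handful of stencil operations over the whole grid, which is exactly the kind of data-parallel work that maps well onto a GPU and can stay resident in graphics memory for direct visualization.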

Although this novel simulation technique does not yet reach the high precision of conventional simulations, it makes it possible to observe trends and tendencies.

We introduce an example-based photometric stereo approach that does not require explicit reference objects. Instead, we use a robust multi-view stereo technique to create a partial reconstruction of the scene, which serves as scene-intrinsic reference geometry.

Similar to the standard approach, we then transfer normals from reconstructed to unreconstructed regions based on robust photometric matching. In contrast to traditional reference objects, the scene-intrinsic reference geometry is neither noise free nor does it necessarily contain all possible normal directions for given materials.

We therefore propose several modifications that allow us to reconstruct high-quality normal maps. During integration, we combine both normal and positional information, yielding high-quality reconstructions. We show results on several datasets, including an example based on data collected solely from the Internet.
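The normal-transfer idea can be mimicked in a few lines: render hypothetical Lambertian intensities for reference normals under a few lights, then give an unknown pixel the normal of its best photometric match. This is a toy sketch of the matching step only, without the robustness machinery the approach adds:

```python
import numpy as np

rng = np.random.default_rng(0)
lights = np.array([[0., 0., 1.], [1., 0., 1.], [0., 1., 1.]])
lights = lights / np.linalg.norm(lights, axis=1, keepdims=True)

def shade(normals):
    """Lambertian intensities of unit normals under each light."""
    return np.clip(normals @ lights.T, 0.0, None)

# Reference region: normals assumed known from a partial reconstruction
ref_normals = rng.normal(size=(500, 3))
ref_normals[:, 2] = np.abs(ref_normals[:, 2])        # camera-facing
ref_normals /= np.linalg.norm(ref_normals, axis=1, keepdims=True)
ref_obs = shade(ref_normals)

# Unreconstructed pixel: match its intensity vector against the reference
true_n = np.array([0.3, 0.2, 0.93])
true_n /= np.linalg.norm(true_n)
obs = shade(true_n[None, :])
best = np.argmin(np.linalg.norm(ref_obs - obs, axis=1))
transferred = ref_normals[best]                       # copied normal
```

With enough reference samples, the nearest intensity vector comes from a nearby normal, which is the premise that makes example-based transfer work; the caveat noted above is that a scene-intrinsic reference may not cover all normal directions.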

A new wing is designed on the computer. Is its lift really better than that of conventional wings? A computer simulation can shed light on this. Conventional simulations, however, usually deliver the desired results only after several hours or days.

A new method now delivers the first simulation results in real time.

Traditionally, 3D acquisition technologies have been used to record heritage artefacts and to support specific tasks such as conservation or provenance verification. These exercises are usually one-off, as the technology and resources required are cost-intensive. However, there is a recent impetus toward the creation of 3D collections that document heritage artefacts and are semantically enriched by annotations.

A requirement of these solutions is the ability to support several representations of a heritage artefact recorded through time. This paper will propose an infrastructure to systematically enrich 3D shapes in a collection by using propagated annotations.

The results of this research have the potential to support heritage organisations in making their semantically rich 3D content available to a wider audience of professionals.

VAST is a primary event, enabling on the one hand the interaction between archaeologists, Cultural Heritage professionals and Information and Communication Technology (ICT) scientists, and on the other hand the dissemination of innovative solutions for the preservation and conservation of the world heritage.

VAST is a unique scenario for demonstrating state-of-the-art research in the areas of data acquisition and processing, metadata handling, and presentation, as well as for communicating and understanding practitioners' experience in the field. The VAST platform boosts the establishment of 3D documentation and digital libraries as standard instruments for the integration of 3D digital assets within professional environments, fostering scholarly research and public engagement.

The VAST conference not only focuses on the development of innovative solutions, but also investigates the exploitation of computer science research by the cultural heritage community. The transition from research to practical reality can be fraught with difficulty. The digital environment provides new opportunities and new business processes for sustainability, but with these opportunities there are also challenges.

This year, the short and project papers will be complemented by their own dedicated panel discussions.

The digitally documented world heritage is archived in databases or repositories, where collections of metadata, images, multimedia objects or, nowadays, even digital 3D artifacts can be stored and queried. Modeling and linking all this information is complex and involves refined categorizations and relations, which are usually accessed through either simplistic or overwhelmingly complex interfaces.

Hence, finding the right level of abstraction for a general interface is very challenging. This becomes even more demanding, if in addition to collection exploration, semantic enrichment is required.

This work focuses on the design and implementation of an integrated interface, in which four dedicated activities are combined and provided: This integrated interface can handle different kinds of multimedia objects, allowing for querying and annotating text, 2D images or 3D artifacts.

We present the software design of this interface and the corresponding underlying model in the semantic network. This work is a general step toward interfacing to 3D Linked Open Data.

Over the years, faster hardware - with higher clock rates - has been the usual way to improve computing times in computer graphics. Aside from highly costly parallel solutions affordable only by big industries - like the movie industry - there was no alternative available to desktop users.

Nevertheless, this scenario is changing dramatically with the introduction of more and more parallelism into current desktop PCs. Multi-core CPUs are a common basis in current PCs, and the power of modern GPUs - which have been multi-core for a long time now - is being unveiled to developers.

Yet CUDA's specific target - NVIDIA graphics cards only - does not provide a solution for other parallel hardware present. OpenCL is a new royalty-free, cross-platform standard intended to be portable across different hardware manufacturers and even different platforms.

As an example application, we use ray tracing algorithms. Three kinds of ray tracers have to be developed in order to conduct a fair comparison: In the end, a comparison between them is presented and analyzed, showing that the CUDA implementation has the best frame rate, but is very closely followed by the OpenCL implementation.
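Whatever the backend (CPU, CUDA or OpenCL), the ray tracers being compared share the same core primitives. A ray-sphere intersection test, for instance, can be sketched as:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-direction ray to the nearest sphere hit,
    or None if the ray misses (or the hit lies behind the origin)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c                 # quadratic discriminant
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0       # nearer of the two roots
    return t if t > 0.0 else None

hit = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)   # hits at z = 4
miss = ray_sphere((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0)
```

Since every ray evaluates such tests independently, the workload is embarrassingly parallel, which is what makes ray tracing a natural benchmark for comparing CUDA and OpenCL.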

The project underlying this contribution investigates to what extent geometry-analysis approaches from industrial applications can be adapted to the problem of defining water-land boundaries. One goal is to minimize user intervention.

A personal computer can be considered a one-node heterogeneous cluster that simultaneously processes several application tasks. In this way, a high-performance heterogeneous platform is built on a desktop for data-intensive engineering calculations.

From our perspective, workload distribution over the Processing Units (PUs) plays a key role in such systems. This issue presents challenges, since the cost of a task on a PU is non-deterministic and can be affected by parameters not known a priori. This paper presents a context-aware runtime and tuning system based on a compromise between reducing the execution time of engineering applications - due to appropriate dynamic scheduling - and the cost of computing such scheduling, applied on a platform composed of a CPU and GPUs.

Results obtained in experimental case studies are encouraging, with a performance gain of

The use of 3D shapes in different domains, such as engineering, entertainment, cultural heritage or medicine, is essential for representing 3D physical reality. Regardless of whether the 3D shapes represent physically or digitally born objects, meshes are a versatile and common representation of 3D reality. Nonetheless, the mesh generation process does not always produce qualitative results; thus incomplete, non-orientable or non-manifold meshes are frequently the input for the domain application.

The domain application itself also imposes special requirements, e.g. Moreover, the processes applied on the meshes, e.g. These operations need to be robust, so that the neighboring information can be consistently updated during the dynamic changes. Dealing with this mesh diversity usually requires dedicated data structures to perform well in the given domain application.

This paper compiles the considerations toward designing a data structure for dynamic meshes in a generic and robust manner, regardless of the type and quality of the input mesh. These aspects enable a flexible representation of 3D shapes toward general-purpose geometry processing for dynamic meshes in 2D and 3D.

Dense 3D reconstruction of environments is important for various applications such as augmented reality, artefact digitization and object classification.

Object classification in particular allows for scene understanding. This work proposes the development of a pipeline for image-based 3D reconstruction and object recognition. The 2D images under consideration are inside-out images of the interior of a room.

A dense 3D reconstruction allows the description of the room as point clouds on which the object recognition algorithms are implemented. To allow for flexibility in terms of image acquisition methods, the algorithm is robust to the type of image input as well as the number of images. The reconstructed scenes are then acted upon by the 3D feature extractor and the features are compared with pre-trained classifiers from a database to carry out object recognition.

The pipeline has been developed to allow for different types of Multi-View Stereo input images. While planar images allow for cheap equipment, spherical and cylindrical panoramic images allow for ease of image acquisition. The Iterative Closest Point algorithm is used to integrate the depth maps and generate the mesh model. The pipeline also accepts input generated with the Microsoft Xbox Kinect. We also study the prospect of using 2D-to-3D feature correlation to find objects in the generated 3D model of the room from a 2D image of that room.
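Inside each Iterative Closest Point iteration, the rigid alignment for a given set of tentative correspondences has a closed-form solution (the Kabsch algorithm via SVD), sketched here on synthetic data:

```python
import numpy as np

def kabsch(P, Q):
    """Best rigid transform (R, t) mapping point set P onto Q,
    assuming row i of P corresponds to row i of Q. This is the
    alignment step at the core of each ICP iteration."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)              # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])             # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic test: rotate and translate a random cloud, then recover
rng = np.random.default_rng(42)
P = rng.normal(size=(50, 3))
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = kabsch(P, Q)
```

Full ICP alternates this closed-form step with re-estimating correspondences (nearest neighbors between the clouds) until convergence.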

This work also presents the results of a comparative study of the different possible methods for completing the task. We also study different image geometries to further explore invariance to camera models. Finally, the pipeline has been integrated into the Rapid Prototyping Environment (RPE) framework of the Department of Interactive Engineering Technologies as a plug-in providing additional functionality.

Digital Mock-Up (DMU) is a widely adopted technology for virtually investigating geometrical and mechanical product properties. By enhancing DMU with functional aspects, considerably more insight into product properties can be gained.

To enable FunctionalDMU, two main tasks have to be solved: In this paper we present an independent and open approach to a FunctionalDMU framework, including co-simulation. Starting with proprietary, natively given behavior and geometric models in formats like JT, we wrap the behavior models in SysML to enable data exchange in an agreed, standardized format.

The native behavior models are still executed in the corresponding simulators. The simulators are linked to the FunctionalDMU framework using a wrapping approach. During simulation, a simulator-coupling algorithm controls the simulation processes. A dedicated visualization environment enables the user to interact with the simulation, i.e. This paper introduces the components of the FunctionalDMU framework and illustrates the approach with an application example.

Bein, Matthias; Fellner, Dieter W. We present a genetic algorithm for approximating densely sampled curves with uniform cubic B-splines suitable for Combined B-reps. A feature of this representation is the ability to alter the continuity property of the B-spline at any knot, allowing freeform curves and polygonal parts to be combined within one representation. Naturally, there is a trade-off between different approximation properties, such as accuracy and the number of control points needed.

Our algorithm creates very accurate B-splines with few control points, as shown in Fig. Since the approximation problem is highly nonlinear, we approach it with genetic methods, leading to better results compared to classical gradient-based methods.
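A minimal evolutionary sketch of the idea: evaluate a uniform cubic B-spline and let a simple (1+1) evolution strategy adjust the control values to fit sampled data. This is a toy stand-in for the genetic algorithm described here, which additionally optimizes the number of control points and the continuity at knots:

```python
import random

def bspline_point(ctrl, u):
    """Evaluate a uniform cubic B-spline at u in [0, len(ctrl) - 3]."""
    seg = min(int(u), len(ctrl) - 4)
    t = u - seg
    b = ((1 - t) ** 3 / 6.0,
         (3 * t ** 3 - 6 * t ** 2 + 4) / 6.0,
         (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6.0,
         t ** 3 / 6.0)
    return sum(w * ctrl[seg + k] for k, w in enumerate(b))

def fitness(ctrl, samples):
    """Sum of squared deviations from uniformly spaced samples."""
    segs = len(ctrl) - 3
    err = 0.0
    for i, y in enumerate(samples):
        u = segs * i / (len(samples) - 1)
        err += (bspline_point(ctrl, u) - y) ** 2
    return err

# (1+1) evolution strategy: mutate all control values, keep improvements
random.seed(0)
samples = [abs(x / 20.0 - 1.0) for x in range(41)]   # V-shaped target data
ctrl = [0.5] * 7
best = fitness(ctrl, samples)
for _ in range(2000):
    cand = [c + random.gauss(0, 0.05) for c in ctrl]
    f = fitness(cand, samples)
    if f < best:
        ctrl, best = cand, f
```

Because the error is a black-box function of the candidate spline, such stochastic search needs no gradients, which is what makes it attractive for the highly nonlinear approximation problem mentioned above.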

Parallelization and adapted evolution strategies are used to create results very quickly.

We present a physically based interactive simulation technique for deformable objects. The implementation is simplified, as spatial derivatives and integrals of the displacement field are obtained analytically, avoiding the need for numerical evaluation of the elements' stiffness matrices. We introduce a novel traversal that accounts for adjacency in order to accelerate the reconstruction of the global matrices.

We show that our proposed method can compensate for the additional effort introduced by the co-rotational formulation to a large extent.