at the hospital participating in the research from November 2008 to November 2009, which were selected from the database of the multicenter research. For analysis, the data were grouped in accordance with Minayo (2010). Results: two themes emerged: puerperas’ perceptions of the obstetric care received in the center, and health professionals in the process of parturition. Conclusion: it was evidenced that adolescent mothers who received care they deemed ideal and committed to them experienced the parturition process in a more pleasurable way.
Payment of the broker's commission
Figure 4. Activities carried out in the process of concluding insurance contracts (source: own study)
The process of concluding insurance contracts has a multi-directional impact on the financial performance of insurance companies. This applies both to the activities carried out in the process and to the costs associated with the implementation of the process. The values of assets and liabilities change, as do the positions of the insurance technical account and of the profit and loss account. This causes changes in the financial performance of insurance companies. A model of financial flows in the process of concluding insurance contracts is shown in Fig. 5.
2008 to 2009 (Chart 3). Namely, the onset of the global economic crisis had a strong impact on the Montenegrin economy, as was also the case with other countries of similar characteristics in the neighbourhood.
Industrial production values continued to vary after 2010. Nevertheless, 2014 and 2015 brought about new growth in industrial production that should continue in the future, reaching and exceeding the values achieved between 2004 and 2008. Data presented in Charts 1-3 indicate that the level of GDP per capita increased over the entire analysed period (except in 2009, 2012 and 2015). The declines in those years can be attributed, primarily in 2009, to the negative effects of the global economic crisis, which lasted until 2015. The economy of Montenegro is still in the recovery period, which will require thorough and effective reforms. Furthermore, the increase in GDP in the period 2001-2014 was accompanied by a decline in the share of industry in GDP (except in 2007, 2008 and 2013) and by an increase in the value added of the service sector. Based on the above-mentioned data it is possible to get an initial impression of the deindustrialization process in Montenegro. Namely, it is evident that in the period 2001-2014 (with the exceptions mentioned above), it had the characteristics of a natural process resulting from economic development, which is characteristic of developed economies. Furthermore, in 2015 and 2016, GDP growth was accompanied by an increase in industry value added. Such a situation in some way points to the beginning of the process of re-industrialization, i.e. industry development under changed conditions, which is also a priority set by the key development strategies of Montenegro, explained in more detail below in the research. In
University of Stellenbosch Business School (USB), Bellville, South Africa
Suggested Citation: Mostert, F. J. (1987): An empirical study of the concepts process of
manufacture and similar process, South African Journal of Business Management, ISSN 2078-5976, African Online Scientific Information Systems (AOSIS), Cape Town, Vol. 18, Iss. 1, pp. 1-4,
As a next step, insights from cognitive psychology and an existing categorization were used to derive a categorization of the interactions with the modeling tool CEP. In total, five model interaction types have been identified: Deployment Steps, Adaptation Steps, Positioning Steps, Rejection Steps and Repetition Steps. Consequently, the recorded model interactions were used for PPM evaluations, and with this new categorization all PPM instances and their corresponding model interactions have been categorized. These model interaction types are machine-readable, i.e. they are semantically annotated and in a form that allows the tool CEP to deduce them (described in Section 1.2.2). All model interaction types were detected fully automatically, with the exception of the Repetition Steps. In this case we speak of semi-automatic detection, because the logged Process of Process Modeling (PPM) (which is created fully automatically) needs to be assessed afterwards by a person regarding repetition aspects. In short, a two-stage approach has been developed, where first all automatically identifiable model interactions were detected, in order to subsequently detect all Repetition
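The two-stage approach can be sketched as follows. The event names, the mapping to interaction types, and the repetition heuristic below are assumptions for illustration only; they are not the CEP tool's actual schema.

```python
# Hypothetical stage-1 mapping from logged operations to interaction types.
AUTOMATIC_TYPES = {
    "create": "Deployment Step",
    "update": "Adaptation Step",
    "move": "Positioning Step",
    "delete": "Rejection Step",
}

def classify(events):
    """Stage 1: label each logged interaction automatically.
    Stage 2: flag candidate Repetition Steps (the same model element is
    touched again) for manual review, mirroring the semi-automatic step."""
    labeled, seen = [], set()
    for op, element in events:
        step = AUTOMATIC_TYPES.get(op, "Unknown")
        needs_review = element in seen  # candidate Repetition Step
        labeled.append((step, element, needs_review))
        seen.add(element)
    return labeled
```

A run over three logged events, `classify([("create", "A"), ("move", "A"), ("create", "B")])`, labels the second event as a review candidate because element "A" was touched before.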
For the investigation of the space charge compensation process due to residual gas ionization and the experimental study of the rise of compensation, a Low Energy Beam Transport (LEBT) system consisting of an ion source, two solenoids, a decompensation electrode to generate a pulsed decompensated ion beam and a diagnostic section was set up. The potentials at the beam axis and the beam edge were ascertained from time-resolved measurements by a residual gas ion energy analyzer. A numerical simulation of self-consistent equilibrium states of the beam plasma has been developed to determine plasma parameters which are difficult to measure directly. The temporal development of the kinetic and potential energy of the compensation electrons has been analyzed using the numerically obtained results of the simulation. To investigate the compensation process, the distribution and the losses of the compensation electrons were studied as a function of time. The acquired data show that the theoretically estimated rise time of space charge compensation, neglecting electron losses, is shorter than the build-up time determined experimentally. To describe the process of space charge compensation, an interpretation of the achieved results is given.
The definition of conceptual scales is a key step of the development process and most relevant with regard to reuse. In this step, provisional query themes which are formulated in the first step are refined to conceptual scales. Depending on the amount of theory that the domain expert has on the attributes, we have to consider two cases: theory- and data-driven development of the conceptual scales.
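As an illustration of what a conceptual scale can look like, the sketch below derives an ordinal scale that maps a numeric attribute value to binary scale attributes. The thresholds and attribute names are invented for illustration; they are not taken from the source.

```python
def ordinal_scale(value, thresholds):
    """A minimal ordinal conceptual scale: for each threshold t, derive a
    binary scale attribute '<= t' that holds if the value does not exceed t."""
    return {f"<= {t}": value <= t for t in thresholds}

# Hypothetical example: scaling a numeric attribute against three thresholds.
print(ordinal_scale(42, [10, 50, 100]))
```

In the theory-driven case the thresholds would come from the domain expert; in the data-driven case they would be derived from the distribution of the attribute values.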
Informationssystem Deutsch). The dictionary elexiko can be characterized as an online dictionary under construction, a
so-called “Ausbauwörterbuch” (cf. Schröder 1997: 60; Klosa 2013).
The lexicographers of elexiko compile a reference work that explains and documents contemporary German. After
publishing a complete list of headwords on the Internet in 2003, the dictionary was filled with sense-independent information for each headword, which was automatically or semi-automatically generated from the underlying corpus, the elexiko-corpus1 (for example, the frequencies of the headwords) (cf. Storjohann 2005). The next step was the publication of 250 headwords, defined as the demonstration module (Demonstrationswortschatz), which have been fully lexicographically described. Following the latter, we are currently working on the module “Lexikon zum öffentlichen Sprachgebrauch” (dictionary on public discourse). It contains entries selected mainly according to their (high) frequency
The first prerequisite could be met (3.1.2); however, FRET experiments revealed that the scRM6 extension components were not stretched out maximally (3.1.3), which consequently means that the distance between the fluorophore and the DNA axis could not be successfully increased in the fusion protein. Thus, none of the scRM6-EcoRV fusion protein variants was appropriate for the direct visualization of rotational motion. But even with an optimal spatial expansion of the scRM6-EcoRV fusion protein, a direct visualization would not have been possible. The spatial resolution was considered a problem which can be circumvented with the fusion protein (Figure 23). Theoretically, nanometer precision is possible, which would have allowed distinguishing a transversal position change of 20 nm using super-resolution techniques. Effectively, Brownian fluctuations of the DNA molecules take place in the same direction and decrease the spatial resolution by one or two orders of magnitude above the theoretical limitations. An additional problem is the temporal resolution (Figure 61). The transverse position of the fluorescent spot is supposed to change by 20 nm after the enzyme has turned 180° around the DNA molecule (half-rotation). Therefore, the question arises how much time the enzyme needs to perform such a half-rotation. This can be calculated using the equation:
measuring of simplicity depends on the chosen language (Sober, 2002). This effect can be formulated by quoting Goodman’s "new riddle of induction" (Goodman, 1983). Consider a simple and a more complex proposition: (1) all emeralds are green and (2) all emeralds are green up to time t (which has yet to pass) and blue afterwards. We now define "grue" as green until t and blue afterwards and, accordingly, "bleen" as blue until t and green afterwards. Thus, the two statements can be rephrased as (1.1) all emeralds are grue until t and bleen after that and (2.1) all emeralds are grue. We see that the simpler proposition translates to a more complex one whereas the complex second statement is simplified, which is caused by the transformation of description. The same concept applies to MAGER: the simplification of models is supported by the adoption of a new "language" which is based on the transformed state variables. The benefits of a detachment from well-known description concepts have also been discussed in the context of latent variable models. For example, Malaeb et al. (2000) note that "thinking only in terms of directly observable variables confines our horizons and limits our assessment of complex systems". However, the reduction process of MAGER not only produces simple models but also accounts for parsimony, as the tradeoff between simplicity and accuracy is an integral part of the model learning process. Because the method is data-driven, the resulting models are most likely to provide insights into the dominant processes, derived solely from the information inherent in the available data.
a static image. Only a dynamic image could do so. So, the R Iv-BA is unequivocally a content-based process-requirement.
Under which conditions can you form an intention to visit Bratislava? No doubt, there will be a plethora of conditions. Many will be psychological in nature. Let me focus on one in particular. I assume you cannot form an intention you already have. That is, you can form an intention only if you are in a situation in which you do not intend to visit Bratislava. Compare this with the process of raising your arm, for example. You can undergo the process of raising your arm only if your arm is not already raised. In short, your arm must be un-raised to raise it. Similarly, you cannot drive to New York if you are already in New York. You must be outside New York to drive into it. The same, I assume, will hold for an intention. You can only form it if you do not already have it.
The activated sludge process is the most extensively used biological process for municipal and industrial wastewater treatment. It is a process of utilizing microorganisms to convert organic matter into carbon dioxide, water, and inorganic compounds. There are many different bacteria and protozoa in activated sludge; the sum of all organisms and their life together in this ecological system is called “biocoenosis”. This biocoenosis describes the interaction of all organisms in the system. It is subject to continuous change in the amount and composition of the bacteria as an adaptation to varying substrates in their feed. Different process parameters can lead to a different composition. The temperature dependency of bacteria can be mentioned as an example: at different temperatures, different bacteria can survive and/or have their thermal optimum.
Connecticut automotive component manufacturing
company. Singh (2011) investigated the process capability of polyjet printing for plastic components. In his observation, he traced the improvement journey of the process for critical dimensions and the attainment of Cpk values greater than 1.33, which is considered an industrial benchmark. In recent studies conducted by Lin et al. (2013), they focused on turbine engine blade inspection, as it is a key aspect of engine quality. They elaborated on the accurate yield assessment of processes with multiple characteristics, such as the turbine blade manufacturing process. Kumaravadivel and Natarajan (2013) dealt with the application of the six-sigma
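The Cpk benchmark mentioned above can be illustrated with the standard capability index formula; the specification limits and process statistics below are invented sample values, not data from Singh (2011).

```python
def cpk(mean, sigma, lsl, usl):
    """Process capability index Cpk: the smaller distance from the process
    mean to a specification limit, in units of three standard deviations."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# A centered process with sigma = 0.1 mm and specs 9.5..10.5 mm:
print(round(cpk(10.0, 0.1, 9.5, 10.5), 3))  # prints 1.667, above the 1.33 benchmark
```

A Cpk above 1.33 means the nearer specification limit lies more than four standard deviations from the process mean.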
Objective: We aimed to understand how culture influences the delivery process of women. Eight women were interviewed. Method: A qualitative study conducted at a teaching hospital and a basic health unit in the year 2011. The interviews were analyzed and interpreted according to the Analysis of Thematic Content. Results: The positive meaning conveyed by the women who cohabited with the interviewees provided an enriching delivery and influenced the preference for natural labor. The participants who received negative comments felt fear, anxiety and insecurity during the birth experience. Conclusion: We understand that culture influences women’s labor. Descriptors: Anthropology, Nursing, Delivery, Obstetric, Women’s health.
The intention of this document is to describe the Process of a Change Proposal (CP) for the Common Criteria Version 3.1 ( to ) in conformance with . A CP is a means to express any deficiencies or suggestions for improvement of the CC. Such a CP can, e.g., be of an editorial nature (e.g. misspelling of word three in paragraph five of the CEM, ) or can be of a more technical or general scope.
The target of every company is to satisfy customer demands. Especially the clothing industry has to serve individual customer requirements. Textile products have always been and still are the defining attributes of people’s appearance. Consumers’ demands towards commercial clothing companies have been changing rapidly during recent years. Two global megatrends have supported this change: individualization and digitalization. Individualization created demand for frequent collection changes, while still keeping availability high. Digitalization supported the quick distribution of new trends and forced a higher number of requests during peak periods.
The investigated process is supplied with conditioned electrolyte by a peripheral module. The module comprises a number of filters and pumps which feed the electrolyte solution to the machining process and back to the supply module, where the electrolyte is cleaned and refreshed. The pumps have different operating characteristics, e.g. the feed pump runs constantly to provide the process with the needed electrolyte flow rate. Other pumps show a cyclic operating behavior depending on certain time schemes or the filling level of certain tanks. Therefore, a long-term measurement is carried out to determine the average power consumption of the module depending on the electrolyte volume flow rate. Fig. 5 displays the power consumption of the electrolyte supply module for volume flow rates (Q) between 0 and 40 l/min at a pressure of 5 bar. It should be noted that higher feed rates result in higher generator current and thus lead to a decrease in the frontal gap. This results in a decrease in the volume flow rate of the electrolyte. The volume flow rates for the three parameter sets amount to 16.9 l/min (Pp = 3.57 kW),
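The averaging implied by the long-term measurement can be sketched as a time-weighted mean over intervals of approximately constant power, one interval per operating state (constantly running feed pump, cyclically active auxiliary pumps). The durations and power values below are invented, not taken from Fig. 5.

```python
def average_power(samples):
    """Time-weighted average power of the supply module.
    samples: list of (duration_s, power_kW) pairs, one per interval of
    approximately constant power draw."""
    total_time = sum(d for d, _ in samples)
    energy = sum(d * p for d, p in samples)  # kWs over the measurement window
    return energy / total_time

# Hypothetical example: feed pump alone at 2.0 kW for 480 s, plus a cyclic
# pump raising the draw to 3.5 kW for 120 s of a 600 s window:
print(average_power([(480, 2.0), (120, 3.5)]))  # prints 2.3 (kW)
```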
In general, phases or sub-processes within the entire annotation process can be detected which are similar in the different described cases and thus can be mapped mutually. First of all, preliminary activities can be identified that refer to the scheduling or planning of the annotation process and the configuration of the applied system. Moreover, further workflow sub-processes can be derived that refer to the actual tasks of annotation. These include the definition of parts that are going to be annotated on the one hand, and acts of information attachment on the other hand. Some of the investigated workflows (or models) make clear that these acts of annotation are often accompanied by data verification, that is, users constantly search and browse their own annotations or results from co-annotators. This can lead to the revision and modification of results and, consequently, to jumps and iterative loops within the process. Additionally, it is also exploited in order to train annotators or, in the case of machine-learning-based processing, also systems. Finally, the resulting documents of annotated media files are exported for further processing or published in diverse distribution formats. Thus, the results of an annotation process may have an impact on subsequent projects. The different annotation workflows described in the previous sections show that it is not possible to declare a “best practice” of annotation. Although commonalities could be demonstrated that allow a summarization of activities into process phases, the execution order of sub-processes may differ strongly to some extent. For instance, the exemplified workflows show that automated approaches mostly evince pipeline-like structures, while workflows in which user interaction is required exhibit more networked task interconnectedness. Thus, even though different processes comprise identical tasks, the positions within the entire workflow may differ.
A representative example is the segmentation task, which is mostly executed during the factual annotation phase, but is also displaced to preliminary activities in some cases. This suggests that the establishment of a general process model for annotation requires a suitable degree of abstraction. Second, it justifies the idea of permitting users (or administrators) to individually specify the annotation process by means of workflow management approaches, which constitutes one of the main contributions of this thesis.
lubricant. It includes the process power as well as effects of the lubricant itself.
In order to estimate these effects of the lubricant supply inside the cutting process, the temperature near the cutting zone had to be measured. Therefore, a thermocouple has been installed in the outer coated cemented carbide cutting insert of the drilling tool with a diameter of 22 mm, as seen below. Both the drilling tool and its inserts have been provided by KOMET GROUP. For drilling, a tool of the type KUB Pentron and corresponding inserts W80 20010.082730 SOGX 07T208-01 BK2730 with a TiAlN PVD coating have been used. The following Figure 2 shows the position of the thermocouple, which is located approximately 0.1 mm from the main cutting edge. For a better heat transfer from the insert to the thermocouple, a thermal compound has been applied additionally.
Our research starts with the assumption that a complete and freely analyzable event log is usually not available. We regard this scenario as the most common one. Thus, one of the major aims of our research is to harvest process execution knowledge. This enables the assembly of a process execution log. This log is built up independently of the existence of information systems that are (at least partly) executing the processes. We developed a special software tool, the Process Observation (PO) tool, which can be envisioned as permanently running on the computers of process participants and asking them “What are you doing right now?”. The participants then have to describe what they are doing. Here, the user does not need any process modeling skills. This is an important prerequisite, since we assume that only few process participants have process modeling skills. The recorded data is used by PO to mine for process models. Of course, this process information can be enriched and complemented by event logs from information systems that are involved in process execution. Gathering process execution information comes with the cost that process participants have to record what they are doing. Of course, this means additional work for the process participants. Therefore, PO must offer a stimulus that motivates process participants to work with PO. This stimulus is put into effect by a recommendation service. PO continuously analyzes the available process log data to guide the process users. This means it suggests process steps, documents or tools that the user will most probably perform or use. We have experienced that this feature is especially important for users who are not yet too familiar with the application; they are thankful that the PO tool recommends possible process entities. This dynamic recommendation service becomes more and more reliable the more process instances have been executed under the guidance of PO.
The execution of the first instances of a process will therefore not be considerably supported. The effect becomes apparent once a number of process instances have been executed.
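The frequency-based recommendation idea can be sketched as follows: suggest the step that most often followed the current step in previously recorded instances. The step names and the simple successor-counting scheme are assumptions for illustration; the source does not specify PO's actual algorithm.

```python
from collections import Counter, defaultdict

def build_successors(instances):
    """Count, over all recorded instances, which step follows which."""
    succ = defaultdict(Counter)
    for steps in instances:
        for a, b in zip(steps, steps[1:]):
            succ[a][b] += 1
    return succ

def recommend(succ, current_step):
    """Suggest the most frequent successor; None for unseen steps, which is
    why the first instances of a process receive little support."""
    if current_step not in succ:
        return None
    return succ[current_step].most_common(1)[0][0]

# Hypothetical log of three executed process instances:
log = [["open case", "check docs", "approve"],
       ["open case", "check docs", "reject"],
       ["open case", "check docs", "approve"]]
succ = build_successors(log)
print(recommend(succ, "check docs"))  # prints "approve" (2 of 3 instances)
```

As the note above points out, the recommendations improve only as more instances accumulate: a step never seen before yields no suggestion at all.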