The poverty reduction goal of the European Union's Lisbon Strategy failed badly, and the poverty target of the Europe 2020 strategy does not appear attainable. Both strategies rest on variants of the "at risk of poverty" indicator, whose name is incorrect and misleading. We demonstrate theoretically and empirically, on the basis of cross-sectional, time-series, and panel cointegration results, that the at-risk-of-poverty indicator essentially measures income inequality rather than poverty. Our calculations show that, taking into account the positive effect of expected economic growth on material deprivation and low work intensity, the Gini coefficient of income inequality would have to fall by 3.5 points in every single EU country for the Europe 2020 poverty target to be reached, which is improbable. The enormous differences between national poverty thresholds render the at-risk-of-poverty indicator meaningless for the EU as a whole. We estimate an EU-level income distribution, from which we compute EU-wide poverty indicators. The political agreement among EU member states concerned the reduction of poverty, not the reduction of income inequality. Although arguments can be made for pursuing lower income inequality, political consensus would be needed to define an inequality target and the corresponding economic and social policies.
The improvement of fuel efficiency in the freight transport sector accounts for 15.5% of the total saving potential. Trucks will see slower energy efficiency improvements through technological development than passenger cars, although they start from a higher level of fuel efficiency. Nevertheless, measures such as driver training, efficient air-conditioning and ensuring correct tyre pressure will still make an important contribution to overall energy efficiency. Furthermore, a variety of technological options can be implemented to decrease the fuel consumption of trucks (Bates et al. 2001). Actions directly targeting heavy-duty vehicles are not explicitly mentioned in the EEAP. Only a footnote in the Analysis of the Action Plan indicates actions planned by the European Commission aimed at heavy-duty vehicles (SEC 2006). However, concrete measures and timetables for improving energy efficiency are not given. In contrast to heavy-duty vehicles, light-commercial vehicles are announced to be included in the future strategy concerning vehicle fleet emission targets (SEC 2006). The optimisation of logistics and traffic management systems can also decrease fuel consumption for heavy-duty and light-commercial vehicles. Neither measure, however, is explicitly mentioned in this context.

Table 5: Energy Saving Potential in the Transport Sector
experiences, however, the success rate is very high. The snowball sampling technique is theoretically a linear expansion from one point to another, including an exponentially increasing population. The successive approach starts with a larger population of periphery persons but does not increase exponentially. Usually, one periphery person comes up with one person of the hard-to-reach population. In my case this meant that I never asked a gang member to introduce me to another gang member. The successive approach is useful for researchers whose field research is limited to a short period of time, because it is based on parallel processes—that is, a high number of periphery persons are contacted at the same time and asked to facilitate contact with the target population. In this two-step approach periphery persons are at the center of two relationships of trust: with members of the populations of interest and with researchers, who trust that the periphery persons will not put their lives in danger by setting up dubious meetings. The bond of trust established between a researcher and a periphery person can also enhance the researcher's security: the periphery person develops a kind of responsibility for the researcher's safety and can be expected not to place the researcher in harm's way through bad decisions (e.g., arranging dubious meetings at night). More specifically, I noticed that the periphery persons felt responsible for my well-being and advised me when a situation was safe ("don't worry"), when to pay attention ("be careful"), and when not to go to a meeting or to leave a scene ("it's time to go"). In this vein, the successive approach creates a network in which a researcher can work with a certain level of personal safety.
substantial decline in the number of self-employed in absolute terms, because of the total decrease in employment, the share increased to 37.0% in 2013, compared to 16.5% in the EU, and dropped to 34.1% in 2016 vs 15.8% in the EU.9 As shown in Table 1, the share of individuals living with a self-employed head in the total population dropped consecutively from 2007 to 2013 and then to 2016, with the exception of the "self-employed without employees in agriculture", whose share rebounded in 2016. Although the mean income of the population living in households with a "self-employed with employees" head declined both in real terms and in relation to the mean income, it remains well above the mean income of the population in all years and is 36% higher at the end of the observation period. On the contrary, the mean income of the population living with a "self-employed without employees" head working in agricultural activities is well below the mean income in all years, but improved significantly between 2013 and 2016. Also, unlike the rest of the population, this population sub-group is likely to have in-kind incomes in the form of consumption of its own agricultural production. Members of households headed by "self-employed without employees in non-agricultural activities" have a mean income closer to the national average at the beginning and end of the observation period, and temporarily improved their relative position at the peak of the crisis, even though their incomes fell in real terms in both sub-periods. The shares of population groups with a head employed in either the private or the public sector decreased between 2007 and 2013. Yet the share increased substantially between 2013 and 2016 for private sector employees (reaching 19.9% in 2016), and slightly for public sector employees (11.5%), in line with the decline in the aggregate unemployment rate. The relative income of individuals living in households with a private sector employee as head is close to the mean income, while for households headed by a public sector employee it is much higher – almost a quarter above the national average. This evidence runs contrary to the claim often made in public discourse that, although public sector employees did not experience unemployment, they paid a very high price because their salaries were reduced far more than private sector salaries. The change in incomes for both groups is large and negative in absolute terms between 2007 and 2013, and slightly improves between 2013 and 2016.
3. Why is there agglomeration, and is it bad?
In this section, two processes of agglomeration are discussed. In the first, I focus on the formation of clusters of firms in an economy whose markets are supposed to be unaffected by the clusters' size, presumably because they are small relative to the rest of the economy (Section 3.1). In the second, I will shift to general equilibrium and will assume that both workers and firms are mobile, thus generating market effects at the level of the whole economy (Section 3.2). In both settings, consumers and firms can locate in one region only, which stands for the fundamental indivisibility that appears at the level of the person or of the plant. Also common to both settings is the fact that the emerging locational configuration is the outcome of the interplay between centrifugal and centripetal forces. The most typical feature of the analysis is that the two processes are self-reinforcing. In particular, we will see that, once transport costs (broadly defined to include all impediments to trade) have sufficiently decreased, regions that were initially similar end up with sharply contrasting production patterns. Hence, divergence rather than convergence should be expected as integration develops. Yet, as will be discussed in the concluding section, further decreases in transport costs may well foster the dispersion of some activities due to factor price differentials.
4 German Nutrition Society (DGE): Ernährungsbericht 2008, available at www.dge.de/modules.php?name=News&file=article&sid=914.
5 Robert Koch Institute: Studie zur Gesundheit von Kindern und Jugendlichen in Deutschland (KIGGS Study), available at www.kiggs.de.
6 Valid data are not available for the consequential costs of overweight, since illnesses in health reporting are not recorded statistically according to their causes. The costs provided in this report are only an estimate cited by the BMELV (German Federal Ministry for Nutrition, Agriculture and Consumer Protection) and BMG (German Federal Ministry of Health) in Gesunde Ernährung und Bewegung – Schlüssel für mehr Lebensqualität, 2007, 2. These estimates are based upon a study by the BMG: Kosten von ernährungsabhängigen Krankheiten in der BRD im Jahre 1990, Volume 27, 1993. At that time, related costs were calculated to be 42.7 billion euros, as cited by the German Society for Nutritional Medicine: Newsletter 1: Ernährungsmedizin heute, 2005, and BMG: Daten und Fakten zu Ernährung und Bewegung Prävention. The German Institute for Nutritional Medicine and Dietetics (D.I.E.T.) calculated the data for 1990 as being just as high and came up with costs totaling 148.5 billion Deutschmarks for nutrition-related illnesses in 2001.
The difficulty of achieving integrated risk management can be illustrated by the simple case of commodity producers based outside the U.S. Since commodities are typically priced in USD, these firms tend to have two clearly identifiable exposures that make up a large portion of their overall cash flow volatility: the exposure to the commodity price and the exposure to the exchange rate between the USD and the home currency (in which a high portion of costs are normally incurred and in which performance is normally measured). This seems to be the ideal setting for achieving an integrated form of risk management – one that considers the risk profile of both these exposures and the way they co-vary over time. However, not all commodity producers achieve it. In fact, in many cases the management of FX risk, typically at the discretion of the treasurer, is still considered a task distinct from price exposure management. This separation means that, in the jargon, FXRM is a "silo". Often, FX exposures are also not integrated with other closely related macroeconomic risks, such as interest rate exposures. It can even be argued that there are silos within the FX silo. For example, FX exposures related to cash positions are often looked at separately from the exposures arising from, say, accounts receivable and payable.
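To make the point concrete, here is a minimal numerical sketch (not from the source; the volatilities, the correlation, and the additive revenue approximation are all illustrative assumptions) of how the covariance term drives a wedge between the silo view and the integrated view of total cash flow volatility:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Simulated monthly shocks: commodity price (in USD) and the USD/home-currency
# exchange rate, with an assumed negative correlation between the two.
rho = -0.4
cov = np.array([[0.08**2, rho * 0.08 * 0.05],
                [rho * 0.08 * 0.05, 0.05**2]])
commodity, fx = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Home-currency cash flow shock approximated as the sum of the two exposures.
combined = commodity + fx

# Silo view: each desk adds its stand-alone risk, ignoring co-variation.
silo_vol = np.sqrt(commodity.var() + fx.var())
# Integrated view: the covariance term partially offsets the two exposures.
integrated_vol = combined.std()

print(f"silo estimate of volatility:  {silo_vol:.4f}")
print(f"integrated (true) volatility: {integrated_vol:.4f}")
```

With a negative correlation, the silo estimate overstates the firm's true cash flow volatility; a positive correlation would make it understate the risk instead.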
Thus, it is quite likely that candidates in the control group (in particular the borderline defeated) serve as council replacements. If actual political experience is what matters for income and political career prospects, it is thus sensible to define treatment as actually having served in the council, rather than as being elected into the council on election day. If any regular council member resigns early in the election period and a candidate in the control group thereby gets a permanent seat in the council, and/or if the borderline elected is the one who resigns, the variation in treatment status—defined in this way—will, therefore, be fuzzy at the threshold at rank 0. Fortunately, at least for the 2002 and 2006 elections, there is information on early resignations and effective replacements that can tell the extent to which the treatment effects obtained from running the regression in (1) underestimate the effects of being de facto treated (i.e., actually having served in the council). If borderline elected candidates are defined as having been de facto treated if they did not resign during the first year after the election date, and if defeated candidates are defined as having been de facto treated if they took over someone's permanent council seat at least 300 days before the next election,32 then, according to the 2002 and 2006 data, 95% and 40% of all borderline elected and defeated candidates were de facto treated, respectively. The corresponding percentage among candidates ranked −2 is around 20%.
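As a sketch of how this de facto treatment definition could be operationalized (the column names are hypothetical, not the paper's actual data layout):

```python
import pandas as pd

def de_facto_treated(row):
    """Apply the two rules for de facto council service described above."""
    if row["borderline_elected"]:
        # Rule 1: elected candidates are de facto treated if they did not
        # resign during the first year after the election date.
        return not row["resigned_within_first_year"]
    # Rule 2: defeated candidates are de facto treated if they took over a
    # permanent seat at least 300 days before the next election.
    return row["took_over_seat"] and row["days_to_next_election_at_entry"] >= 300

# Tiny illustrative data set.
candidates = pd.DataFrame({
    "borderline_elected":             [True,  True,  False, False],
    "resigned_within_first_year":     [False, True,  False, False],
    "took_over_seat":                 [False, False, True,  False],
    "days_to_next_election_at_entry": [0,     0,     800,   0],
})
candidates["de_facto_treated"] = candidates.apply(de_facto_treated, axis=1)

# Shares of de facto treated among borderline elected vs. defeated,
# analogous to the 95% / 40% figures reported in the text.
print(candidates.groupby("borderline_elected")["de_facto_treated"].mean())
```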
As for the EU, it faces a credibility issue with regard to the exact terms that it grants the UK: it cannot simply accept whatever member states or third countries want to do at the expense of the union and the European project. The UK's reported desire for a "Norway-plus" agreement, which amounts to cherry-picking in the internal market, is a case in point. Another is the UK's special status that EU leaders (without any backing from their citizens) granted Prime Minister Cameron to induce him to support Remain rather than fighting for Leave, as he said he was prepared to do; ultimately, these concessions turned out to be of no avail other than setting a dangerous precedent and damaging the EU's credibility. That is why access to the single market, which is at the centre of what the EU does, needs to come with clear conditions and rules, safeguarding all of the four freedoms. This is valid for Norway and should be for the UK as well. Indeed, Switzerland may well be about to lose its access to the single market in the near future due to its failure to respect the free movement of persons.18
4.2 Annotating the TüBa-D/Z in the TIGER Annotation Scheme
In order to conduct a meaningful comparison of the impact of the two different annotation schemes on parser output, we extracted a test set of 100 trees from the TüBa-D/Z treebank and manually annotated it following the guidelines in the TIGER stylebook. Due to the high expenditure of time needed for a manual annotation, we were only able to create a small test set. To make up for the restricted size, we carefully selected our test set by subdividing each of the 44 samples from the TüBa-D/Z treebank into five subsamples of 100 sentences each, and picked the subsample with a sentence length and perplexity closest to the mean sentence length (17.24, mean: 17.27) and mean perplexity (9.44, mean: 9.43) computed for the whole treebank. This ensures that our test set, despite its limited size, is maximally representative of the treebank as a whole.
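A short sketch of this selection procedure (assuming per-sentence length and perplexity scores are available; the way the two distances are combined into one score is our assumption, since the text does not spell it out):

```python
import numpy as np

def pick_test_subsample(samples, mean_len, mean_ppl):
    """Pick the 100-sentence subsample closest to the treebank-wide means.

    `samples` is a list of samples, each a list of (length, perplexity)
    pairs, one pair per sentence.
    """
    best, best_dist = None, float("inf")
    for sample in samples:
        # Subdivide the sample into consecutive 100-sentence subsamples.
        for i in range(0, len(sample) - 99, 100):
            sub = sample[i:i + 100]
            lens, ppls = zip(*sub)
            # Distance of this subsample's means from the treebank means
            # (simple sum of absolute deviations, an illustrative choice).
            dist = abs(np.mean(lens) - mean_len) + abs(np.mean(ppls) - mean_ppl)
            if dist < best_dist:
                best, best_dist = sub, dist
    return best
```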
Religious groups have often sought exemptions from practices in which states intervene by promulgating a law to be applied neutrally to the rest of society, arguing either that the law requires them to do things not permitted by their religion or that it prevents them from doing acts mandated by it. For example, Sikhs demand exemptions from mandatory helmet laws and police dress codes to accommodate religiously required turbans. Elsewhere, Jews seek exemptions from Air Force regulations to accommodate their yarmulkes. Muslim women and girls demand that the state not interfere in their religiously required chador. Principled distance allows that a practice that is banned or regulated in one culture may be permitted in the minority culture because of the distinctive status and meaning it has for its members. Religious groups may demand that the state refrain from interfering in their practices, but may equally demand that the state interfere in such a way as to give them special assistance, the argument being that this will enable them to secure what other groups are able to routinely get by virtue of their social dominance. Principled distance may grant authority to religious officials to perform legally binding marriages, to allow religions their own rules or methods for granting divorce and governing relations between ex-husband and ex-wife, their ways of defining a will or laws about post-mortem allocation of property, arbitration of civil disputes, and even methods of establishing property rights.
The European Commission has set a target date of 2025 for Western Balkan EU accession, while also outlining a broader new strategy which includes Brussels taking a more active role in solving political disputes in the region, and upgrading infrastructure as part of the Berlin Process. We welcome these moves: economic underdevelopment in the region is closely tied to political fractures. Aside from resolving political conflicts, improved governance in the region will also be necessary. In terms of meeting economic accession criteria, the region faces a host of challenges, but we believe that a focus on upgrading infrastructure and developing a much bigger and more competitive industrial base should be the priorities. While the economic influence of third parties in the region is not as significant as often portrayed, this is not guaranteed to last, particularly in the case of China, which is set to increase its economic presence in the Western Balkans in the coming years. Even if the region takes a great leap forward towards the EU, there are other barriers in the way which could also hold back accession. Nevertheless, while the 2025 target represents a highly ambitious best-case scenario, it could serve as a powerful incentive for countries in the region to speed up their reform agendas. We do not completely rule out at least Montenegro and Serbia joining the bloc by 2025 or shortly thereafter.
5.4.2. Hardware Programming & Design
In Section 5.3, some applications of functional programming to hardware programming and design were briefly discussed. One of these was the Bluespec programming language, a subset of Haskell that is used as a high-level hardware description language. Once the design is complete, the Bluespec compiler generates synthesizable Verilog. This approach has three impactful advantages. Firstly, using a functional language allows the hardware to be designed at a high level of abstraction: one defines what the hardware is supposed to do instead of specifying every circuit. This also enables programmers with little knowledge about hardware to design their own circuits. Secondly, hardware designs generated from functional code turn out to perform better than corresponding designs written in Verilog or VHDL. Lastly, the mathematical properties of programs written in functional languages allow different kinds of verification. In addition, strong type systems help ensure program correctness.
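The abstraction gap described here can be illustrated with a small, language-neutral sketch (written in Python purely for illustration; Bluespec itself is a Haskell dialect): the behavioral version states what an adder does, while the structural version spells out every gate of a ripple-carry implementation.

```python
# Behavioral description: a 4-bit adder is just addition modulo 2**4.
def adder_behavioral(a: int, b: int) -> int:
    return (a + b) % 16

# Structural description: the same adder built gate by gate as a
# ripple-carry chain of full adders.
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    s = a ^ b ^ carry_in                        # XOR gates
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND/OR gates
    return s, carry_out

def adder_structural(a: int, b: int) -> int:
    result, carry = 0, 0
    for bit in range(4):
        s, carry = full_adder((a >> bit) & 1, (b >> bit) & 1, carry)
        result |= s << bit
    return result

# Both descriptions compute the same function.
assert all(adder_behavioral(a, b) == adder_structural(a, b)
           for a in range(16) for b in range(16))
```

Roughly speaking, a high-level HDL compiler automates the passage from the first kind of description to circuits of the second kind.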
dividend" disbursed at the end of the year since fiscal targets were overshot (EUROMOD, 2016). Yet, due to the worsening condition of the Greek economy in 2015 and the imposition of capital controls, the provision of a "social dividend" was discontinued in 2015. Policy changes in 2016 had a progressive effect on the income distribution, mainly driven by the two means-tested benefits (food stamps and housing allowance) that were introduced in 2015. Further, the tax reform that took place in 2016, changing the income brackets and tax rates of the personal income tax schedule, had a positive effect on the incomes of the poorest income deciles. On the other hand, changes in social insurance contributions, mainly for the self-employed, had a regressive impact (EUROMOD, 2017). Arguably, the most important policy change related to poverty alleviation during the years of the crisis was the introduction of a Minimum Income Guarantee system, first as a pilot project (2014-2016) and then as a full-blown program from 2017 onwards (i.e. outside the period covered in the paper).12
Our results are in line with previous findings of non-spatial preparatory activity in PMd and PPC in conditions where only the movement goal or the effector to move was known (Beurze et al., 2007). The role of such non-spatial activation remains widely unclear, but potential explanations have been put forward for findings in macaques. For instance, when the movement goal is still underspecified, a higher magnitude of non-spatial preparatory firing in the macaque PRR is significantly correlated with shorter reach reaction times (Snyder et al., 2006). Snyder and colleagues (2006) argued that the elevated baseline of PRR activity in underspecified conditions is useful for the rapid development of PRR firing rates that represent the reach goal once it is specified. The more rapid movement goal representation in PRR may in turn cause a faster transfer of the spatial information to the arm muscles, and thereby lead to shorter reaction times. Since the reach goal is already represented in PRR during a delay in conditions when the movement goal is specified, the stimulus to wait for is the go cue. In their experiment, this was the offset of the fixation point, which is not processed by the PRR populations with peripheral response fields encoding the reach goal (Snyder et al., 2006). That may be why the magnitude of PRR activity and reach reaction times are not correlated in conditions with an early specified reach goal. A similar mechanism may account for our findings. The posterior SPL7 showed non-spatial activation in underspecified conditions, which still occurred at a weaker level than in specified conditions. This may guarantee a rapid specification of the reach goal once the context rule (pro or anti) is presented. The posterior SPL7 areas and the aIPS may thus be in a "prepare-to-prepare" state rather than a "prepare-to-move" state.
Francq et al. [4] report from their simulations that "...the finite-sample performance of the VTE seems quite satisfactory" and that the "experiments on daily stock returns do not show sensible differences between the estimated parameters of the two methods." The results of our simulation exercises do not conform to these conclusions. For most parameters and associated predictions, with the notable exception of the unconditional variance and its functions, the bias under variance targeting is larger, at least in median terms, sometimes severalfold. This tendency is typically exacerbated for a heavier-tailed distribution of standardized returns, while distributional asymmetry has little or moderate impact. A larger sample size also has a more favorable effect on estimation precision when no variance targeting is used. For the unconditional variance mentioned above as an exception, the estimator dispersion may be quite high because of a long right tail when no variance targeting is employed (see the numerical example above). However, the median bias can be so much larger under variance targeting that statistics explicitly containing this parameter (in particular, long-run VaR predictions) may exhibit a very large bias. Some effects intensify further if one uses ML based on a leptokurtic distribution in place of normal QML; in particular, the median bias for long-run VaR predictions shrinks significantly with the tail heaviness of the return distribution when no variance targeting is used, but not under variance targeting. To summarize, we conclude that, especially when estimates of the unconditional variance are involved in the statistic of interest, variance targeting is better avoided unless its computational benefits are overwhelming.
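For readers less familiar with the two estimators under comparison, the following minimal sketch (a standard Gaussian QML setup for a GARCH(1,1), not the authors' code) shows where variance targeting enters: the intercept is tied to the sample variance via omega = s2 * (1 - alpha - beta), so only (alpha, beta) are estimated.

```python
import numpy as np
from scipy.optimize import minimize

def garch_nll(returns, omega, alpha, beta):
    """Negative Gaussian quasi-log-likelihood of a GARCH(1,1)."""
    sigma2 = np.empty_like(returns)
    sigma2[0] = returns.var()  # a common initialization choice
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(sigma2) + returns ** 2 / sigma2)

def fit_full_qml(returns):
    """Estimate (omega, alpha, beta) jointly, with no targeting."""
    res = minimize(lambda p: garch_nll(returns, *p),
                   x0=[0.1 * returns.var(), 0.05, 0.90],
                   bounds=[(1e-8, None), (1e-8, 1), (1e-8, 1)])
    return res.x

def fit_variance_targeting(returns):
    """VTE: omega implied by the sample variance; estimate (alpha, beta)."""
    s2 = returns.var()
    def obj(p):
        alpha, beta = p
        if alpha + beta >= 1:  # keep the implied omega positive
            return 1e10
        return garch_nll(returns, s2 * (1 - alpha - beta), alpha, beta)
    res = minimize(obj, x0=[0.05, 0.90], bounds=[(1e-8, 1), (1e-8, 1)])
    alpha, beta = res.x
    return np.array([s2 * (1 - alpha - beta), alpha, beta])
```

Any statistic built on the unconditional variance, such as a long-run VaR prediction, then inherits the finite-sample properties of the corresponding estimator, which is the channel the comparison above turns on.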
its synonyms, such as non-frequency content analysis, thematic coding or ethnographic content analysis, may not figure in the title or abstract of the article. Unless all of these factors are accounted for, it will be difficult to take up a trend study or even to come to a conclusion about the frequency of the application of QCA, either within the respective fields or as a method on its own. Descriptions of QCA as a method in its own right began to appear in the literature only recently, primarily as an outcome of the interaction between researchers of the American and German intellectual traditions since the 1960s (FLICK, 2009; HSIEH & SHANNON, 2005; KUCKARTZ, 2014; MAYRING, 2014; SCHREIER, 2012). As MERTON (1968) pointed out in his interesting essay, the quantitative-manifest and qualitative-latent orientations reflect the American and European intellectual traditions, respectively. According to him, the qualitative and latent approach to content is close to researchers of a European, or more specifically German, intellectual tradition, whose training places greater stress on the meta-perspective of the problem. Thus, individuals of these two traditions, broadly representing the continental and analytical philosophies, are distinct in their understanding of the text as an aspect of reality and in comprehending its meanings. While researchers using an analytical approach assume that reality exists out there, independent of the investigator who seeks to understand it as objectively as possible, those using continental philosophical approaches see no such distinction.
you have it or you don’t. To squeeze it into terms of value means to be challenged by a blur of categories, because the impaired counter-good now stands on both sides of the scale: when the Union exercises power according to the law, it potentially impairs Member States’ autonomy. In the absence of a “counter-good”, however, and with regard to the mentioned limitations regarding the facts, the significance of proportionality will not surmount the level of a loose rational basis test (which is fine as long as it is stated openly). Therefore, the AG in OMT is right when he avoids weighing diffuse competence-values in his three-step test. On the other hand, why did he not just skip this step completely? The reasoning there, including the point about insolvency and quantitative limits, is in essence part of the previous necessity and suitability test. Sure, steps and methods are not carved in stone, but if there is to be any effective “weighing” step at all, we need a commitment to weighable values, like Member States’ autonomy or sphere (e.g. here para 37 or here para 130), constitutional identity (e.g. here para 74) or fundamental rights. Facts themselves, if not referred to such a value, cannot be weighed in terms of proportionality.
Part of the slowing of contractile properties in old age might be caused by the glycation of myofibrillar proteins. In support of this, the velocity of movement of actin filaments in in vitro motility assays was reversibly reduced when the myosins were incubated in glucose (28). Such a situation might occur with the development of insulin resistance and as the result of oxidative stress caused by impaired mitochondrial function (24). The muscles of old diabetic rats and of diabetic humans show increased glycation of myofibrillar proteins (32). It remains to be seen to what extent glycation of myofibrillar proteins can be prevented in old age, but the unloaded shortening velocity of type I and IIa fibers is also reduced in master athletes (14). Whatever the cause, this slowing of contractile properties would lead to a reduction in the power-generating capacity of the muscle tissue beyond that caused by the shift in fiber-type composition. It is also important to consider that the changes in fiber-type distribution and/or atrophy that underlie the age-related changes in muscle may also have implications for the stiffness of the muscle. It is known that type II fibers in animals are more compliant than type I (12), so any atrophy of type II fibers or increased distribution of type I fibers with aging should in theory increase muscle stiffness (which is, of course, activation modulated). The implications of increased muscle stiffness for tendon function will be considered in the next section.