Methods and times

MATERIALS AND METHODS Literature search and effect size estimation

plant groups based on Angiosperm Phylogeny Group III (2009) using PHYLOMATIC (Webb & Donoghue 2005). Relationships at the family and species level of several genera were resolved with the help of published phylogenetic studies (Figure S1). Species lacking phylogenetic information were placed as polytomies at the root of their family or genus (Koricheva et al. 2013) and all branch lengths were transformed to one (Verdú & Traveset 2004). In addition, the branch lengths of our phylogenetic tree were adjusted with the Bladj algorithm of the Phylocom 3.34b program (www.phylodiversity.net/phylocom), with calibration points based on the estimated divergence times of major plant groups (Hedges et al. 2006). This ultrametric tree was converted into a phylogenetic correlation matrix (P) that has the standardised shared branch length distance of each species pair in the off-diagonals and ones on the main diagonal (Lajeunesse 2009; Koricheva et al. 2013). Phylogenetic tree editing was performed in R (R Development Core Team 2011) and Mesquite v. 2.75 (Maddison & Maddison 2011). To analyse each predictor or moderator variable, we constructed a subset tree to estimate P for each category; the subset tree contains only the species present in that particular comparison and retains all the branch length information found in the hypothesised tree (Fig. S1). Because results were similar between traditional and phylogenetic meta-analyses, we interpreted and discussed the results of the latter. Results of the traditional meta-analysis are provided in Table S3.
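
The conversion step described here (ultrametric tree to correlation matrix P) can be illustrated with a short sketch. This is not the authors' Phylocom/Mesquite/R pipeline; it uses a hypothetical four-species ultrametric tree and simply divides each pair's shared root-to-tip branch length by the total tree depth, with ones on the diagonal.

```python
# Minimal sketch (not the authors' pipeline): build the phylogenetic correlation
# matrix P for a toy ultrametric tree. Off-diagonal entries are the shared
# root-to-tip branch length of each species pair, standardised by total tree
# depth; diagonal entries are 1. Topology and branch lengths are hypothetical.
import numpy as np

# Toy ultrametric tree of depth 10: ((A,B):6,(C,D):4);
# Each tip is described by its root-to-tip path of (branch_id, length) pairs.
paths = {
    "A": [("n1", 6.0), ("A_tip", 4.0)],
    "B": [("n1", 6.0), ("B_tip", 4.0)],
    "C": [("n2", 4.0), ("C_tip", 6.0)],
    "D": [("n2", 4.0), ("D_tip", 6.0)],
}
depth = 10.0  # root-to-tip distance (identical for all tips, ultrametric)

species = sorted(paths)
P = np.eye(len(species))
for i, a in enumerate(species):
    for j, b in enumerate(species):
        if i < j:
            shared = sum(length for branch, length in paths[a]
                         if branch in dict(paths[b]))
            P[i, j] = P[j, i] = shared / depth  # standardised shared branch length
print(species)
print(P)
```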

A Near-Extinction Event in Lynx

METHODS AND MATERIALS Molecular methods

economic value put on lynx pelts in the past, we feel that hunting statistics present convincing evidence of the severity of the decline, and we therefore find this explanation unlikely. Second, there may have been undetected migration in recent times (i.e., during the latter half of the 20th century) from the Finnish lynx population, which would have diminished the effects of the bottleneck. However, genetic analyses of Finnish lynx show a very low estimate of gene flow between Sweden and Finland, and we believe that this number reflects ancient migration events that took place before the bottleneck. The last and most probable explanation concerns the number of loci used. The use of only a few loci restricts the power of the tests we chose, although our 11 polymorphic loci exceeded the minimum of 10 recommended for the BOTTLENECK tests. To achieve powers of 0.5 or more for the k test and the g test, more than 30 loci are needed (Reich et al. 1999). Our data set of 12 loci resulted in a power of between 0.1 and 0.4. Few studies have assessed the number of loci required to attain reasonable power, presumably because reliable relatedness estimates and/or F-statistics can be obtained with far fewer loci (and analyses are costly). However, keep in mind that a minimum of 10 polymorphic loci is recommended for the heterozygote excess method implemented in BOTTLENECK. In a performance test of this method on 11 microsatellite data sets from populations known to have experienced a demographic bottleneck, scored at four to 10 loci, signs of a bottleneck were detected in nine populations under the infinite allele model. Assuming a stepwise mutation model, only five of the populations showed signs of a bottleneck (Luikart and Cornuet 1998). All but one of these populations were scored for fewer than 10 loci, and the population scored at 10 loci showed no signs of a bottleneck. This is similar to our findings: despite having 11 polymorphic loci, we failed to detect a bottleneck.
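
The power argument can be made concrete with a toy calculation. The sketch below is not the k test, the g test, or BOTTLENECK's Wilcoxon test; it is a simple one-tailed sign test with a hypothetical per-locus probability of heterozygosity excess under a bottleneck, shown only to illustrate how power grows with the number of polymorphic loci.

```python
# Hedged illustration of why test power rises with the number of loci. This is
# NOT the k test, g test, or BOTTLENECK's test; it is a one-tailed sign test
# with a hypothetical per-locus probability of heterozygosity excess.
from scipy.stats import binom

P_EXCESS_H0 = 0.5   # under mutation-drift equilibrium, excess/deficit equally likely
P_EXCESS_ALT = 0.7  # hypothetical per-locus probability of excess after a bottleneck
ALPHA = 0.05

def sign_test_power(n_loci: int) -> float:
    # Smallest count of "excess" loci that is significant at ALPHA under H0.
    crit = binom.ppf(1 - ALPHA, n_loci, P_EXCESS_H0) + 1
    # Power: probability of reaching that count under the bottleneck alternative.
    return 1 - binom.cdf(crit - 1, n_loci, P_EXCESS_ALT)

for n in (5, 10, 12, 20, 30, 50):
    print(f"{n:3d} loci -> power ~ {sign_test_power(n):.2f}")
```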

Hypericin Extraction Methods and Purification

residue, and sonicated for about 30 minutes. Finally, the red supernatant was separated and kept in a dark glass bottle for further purification. The extraction was repeated several times, adding 24 mL portions of the above-mentioned solvent to the plant residue until a colorless or pale purple supernatant was obtained. All portions were combined and the solvents were evaporated to dryness under nitrogen gas. The dried residue was then dissolved in 4 mL of HPLC mobile phase. The resulting solution was further purified, by the procedure described below, before injection into the HPLC.

LC/MS for the degradation profiling of cough cold products under forced conditions / A Marín and C Barbas

The techniques employed for the analysis of stability samples can be titrimetric, spectrophotometric and chromatographic methods. A critical review of these methods is given in the work of Bakshi and Singh [3]. Among the chromatographic methods, HPLC is the most widely employed because of its high resolution, sensitivity and specificity, and because a large number of compounds can be analysed with this technique. Nevertheless, the information that HPLC supplies is sometimes not enough to confirm the identity of known and unknown degradation products and to determine them selectively. For this purpose, the hyphenated LC/MS technique, which supplies molecular weight and fragmentation information, is an indispensable analytical tool. In many cases, however, method development is required, because HPLC/UV methods are not easily transferred to LC/MS: many HPLC applications require non-volatile buffer solutions for optimal separation, which the LC/MS interface does not tolerate.

Efficient protocols for the extraction of microbial DNA from the rhizosphere of hydrophilic forests in Chile

prevent the correct hybridization of primers to template DNA, inhibit PCR amplification and restriction enzyme digestion. The same processes are also influenced by the phenolic groups of the humic acids, which denature proteins by bonding to amides. Furthermore, these molecules can be oxidized to form quinones that bind covalently to DNA (Lakay et al., 2007). The important factors to consider in regard to DNA extraction are as follows: (1) efficiency, which is accomplished by physical, chemical and enzymatic processes to ensure rupture of the resistant cell structures that are characteristic of some soil microorganisms or spores and (2) the removal of contaminants (e.g., humic acids) that are extracted together with nucleic acids and interfere with subsequent molecular analysis (O’Donnell et al., 1999). The cell disruption protocols can be classified into two categories: those in which the cells are lysed within the soil (direct extraction) and those in which the cells are removed from the soil mix and the extraction is performed on isolated cells (Courtois, 2001). The technique of direct lysis, which was used in this study, is the most widely used because it gives a higher DNA yield and has less bias with respect to the diversity of the microbial community (Miller et al., 1999). Direct extraction methods include different protocols such as grinding in liquid nitrogen, mixing along with homogenization using short homogenization times and heat-shock treatment in a microwave oven with different chemical extractants, such as the ionic detergent sodium dodecyl sulfate (SDS), which acts by dissolving the hydrophobic material in the cells. The material is then subjected to heat with chelating agents such as ethylenediaminetetraacetic acid (EDTA) or Chelex 100 (Robe et al., 2003; Lakay et al., 2007).

Quantitative methods

An outflow of cash occurs when a company transfers funds to another party (either physically or electronically). Such a transfer could be made to pay employees, suppliers and creditors, to purchase long-term assets and investments, or even to pay legal expenses and lawsuit settlements. It is important to note that a legal transfer of value through debt, such as a purchase made on credit, is not recorded as a cash outflow until the money actually leaves the company's hands.
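
That timing rule is easy to see in a short sketch. The class, amounts and field names below are purely illustrative (they are not from the text): a purchase on credit only creates a payable, and the cash outflow is recorded when the supplier is actually paid.

```python
# Minimal sketch of the timing rule described above: a purchase on credit
# creates a payable, not a cash outflow; cash only moves when the bill is paid.
from dataclasses import dataclass, field

@dataclass
class CashLedger:
    cash: float = 100_000.0
    accounts_payable: float = 0.0
    outflows: list = field(default_factory=list)

    def purchase_on_credit(self, amount: float) -> None:
        self.accounts_payable += amount   # obligation recorded, no cash moves

    def pay_supplier(self, amount: float) -> None:
        self.accounts_payable -= amount
        self.cash -= amount               # cash outflow recorded only now
        self.outflows.append(amount)

ledger = CashLedger()
ledger.purchase_on_credit(25_000)    # no cash outflow yet
print(ledger.cash, ledger.outflows)  # 100000.0 []
ledger.pay_supplier(25_000)          # outflow when money leaves the company
print(ledger.cash, ledger.outflows)  # 75000.0 [25000]
```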

Training students for quality: ideas and methods

The approach presented here is given in the form of suggestions. It is difficult to make strong claims about it until solid evidence is collected from the field and analyzed critically. Anecdotal data from the classroom, in particular comments from students as well as their IPDRs, suggest that indeed, this approach induces favorable reactions in terms of process and attitudes. In particular, students taught along the lines advocated in this paper tend to stop making avoidable meaning errors and to learn to make their own decisions faster than students taught with more traditional methods. However, this cannot be taken as conclusive proof: results may be due to specific interactions between the method, the students and the instructor’s personality rather than to the method as such.

The collision avoidance problem: methods and algorithms

Methods for maintaining separation between aircraft have been built into the current airspace system. Humans are an essential element in this process because of their ability to integrate information and make judgments. However, because failures and operational errors can occur, automated systems have begun to appear both in the cockpit and on the ground to provide decision support and to serve as traffic conflict alerting systems. These systems use sensor data to predict conflicts between aircraft, alert humans to a conflict, and can provide commands or guidance to resolve it. Relatively simple conflict predictors have been a part of air traffic control automation for several years, and the traffic alert and collision avoidance system (TCAS) has been in place onboard domestic transport aircraft since the early 1990s. Together, these automated systems provide a safety net should normal procedures and human actions fail to keep aircraft separated beyond established minimums.
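
At its core, a conflict predictor of the kind mentioned above projects current position and velocity estimates forward and checks the predicted miss distance. The sketch below is a minimal two-dimensional closest-point-of-approach check, not TCAS logic or any operational algorithm; the separation minimum, look-ahead time and trajectories are illustrative values.

```python
# Minimal 2-D conflict predictor (closest point of approach), not TCAS logic:
# given position and velocity estimates for two aircraft, flag a conflict if
# their predicted miss distance falls below a separation minimum within a
# look-ahead window. Numbers below are illustrative, not operational values.
import numpy as np

def predict_conflict(p1, v1, p2, v2, sep_nm=5.0, lookahead_s=300.0):
    """Return (conflict?, time_of_closest_approach_s, miss_distance_nm)."""
    dp = np.asarray(p2, float) - np.asarray(p1, float)   # relative position [nm]
    dv = np.asarray(v2, float) - np.asarray(v1, float)   # relative velocity [nm/s]
    speed2 = dv @ dv
    t_cpa = 0.0 if speed2 == 0 else max(0.0, -(dp @ dv) / speed2)
    t_cpa = min(t_cpa, lookahead_s)
    miss = np.linalg.norm(dp + dv * t_cpa)
    return miss < sep_nm, t_cpa, miss

# Two aircraft on roughly converging tracks (positions in nm, speeds in nm/s).
conflict, t, d = predict_conflict(p1=(0, 0), v1=(0.12, 0.0),
                                  p2=(30, 6), v2=(-0.10, -0.02))
print(conflict, round(t, 1), round(d, 2))
```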

Economic Growth – Theory and Numerical Solution Methods

Suppose that fluctuations in the expenditures/output ratio in a given economy can be interpreted as controlled deviations around a pre-announced target level. Should they then be correlated with exogenous supply shocks? This question could be analyzed by solving the model under different positive and negative values for such a correlation and computing the levels of implied welfare. This would have clear implications for the optimal way to conduct policy. Changes in the expenditure/output ratio to accommodate supply shocks under a maintained correlation with supply shocks would have to come together with changes in a given tax rate (on consumption, labor income or capital income, for instance) to balance the budget. In principle, we should expect the answer to the optimal correlation question to depend on the type of tax adjustment chosen, so that the answer has two parts: from the point of view of maximizing private agents' welfare, it is optimal to maintain such a correlation between the expenditures-to-output ratio and supply shocks, and to balance the budget every period by adjusting the fluctuations in expenditures with such a tax rate. This analysis would make sense even if we believe that the random deviations from a specified target in the expenditures/output ratio are beyond the control of the economic authority, since there would still be a welfare-maximizing correlation between these fluctuations and supply shocks. The theoretical analysis in the previous paragraph would have characterized the optimal expenditure/tax policy. We could then identify supply and fiscal shocks separately in actual data, possibly through a structural VAR type of analysis. The estimated correlation between supply shocks and innovations in the expenditure/output ratio, together with the observation of which types of taxes are adjusted most often, would tell us how far the correlation used in actual policy making departs from the value predicted as optimal by the model.
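
The experiment outlined in this passage (scan over correlation values, then evaluate welfare) can be scaffolded in a few lines. The code below is only a sketch under stated assumptions: it draws supply shocks and expenditure/output-ratio deviations with a chosen correlation, and welfare() is a hypothetical placeholder that in the actual exercise would come from solving the growth model.

```python
# Sketch of the experiment described above: draw supply shocks and
# expenditure/output-ratio deviations with a chosen correlation rho, then score
# each rho with a welfare measure. welfare() is a placeholder: in the actual
# exercise it would come from solving the model; here it is purely hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def draw_shocks(rho, n=10_000, sd_supply=0.01, sd_ratio=0.005):
    cov = [[sd_supply**2, rho * sd_supply * sd_ratio],
           [rho * sd_supply * sd_ratio, sd_ratio**2]]
    supply, ratio_dev = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    return supply, ratio_dev

def welfare(supply, ratio_dev):
    # Placeholder: penalise the volatility of a crude "consumption" proxy.
    consumption = supply - ratio_dev
    return -np.var(consumption)

for rho in (-0.5, 0.0, 0.5):
    s, g = draw_shocks(rho)
    print(f"rho = {rho:+.1f}  implied welfare proxy = {welfare(s, g):.3e}")
```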

Chronological versus dental age in subjects from 5 to 19 years: a comparative study with forensic implications

osseous structures that could determine biological age and characterize three legally important chronological age groups, thereby developing a suitable and applicable model for the Colombian population, which is currently employed by the National Institute for Legal Medicine and Forensic Sciences in Bogotá (Instituto Nacional de Medicina Legal y Ciencias Forenses, Seccional Bogotá). Likewise, this study determined that dental maturation was more effective than osseous maturation assessed through the carpogram when estimating biological age. Thus, through this analysis of dental development we can obtain valuable information from deceased humans in a way which facilitates identifying an individual and provides preliminary forensic proof that can guide judicial proceedings. For living individuals, the opinion on age within the field of the forensic sciences, especially forensic dentistry, plays an important judicial role because the classification of a crime committed by a juvenile depends on whether the offender is less than 14 years of age or between 14 and 18 years of age, which determines how they will be penalized, the place of confinement, and the restoration of rights: under legal age per the Code for Children and Adolescents (Legislation 906 of 2004) and of legal age per the Penal Code (Legislation 1098 of 2006). Similarly, in cases involving victimizers, the charges will be aggravated if the victims are less than 14 years of age. However, there are difficulties when estimating the biological age of an individual for forensic purposes; for this reason, macroscopic methods have been devised for chronological age approximation based on osseous development (observation of the shape and state of metamorphosis of the ossification centers, epiphyseal closure, and the length of some bones), growth and development of body hair and external sexual organs, and developmental stages, along with dental eruption 2,5. Hence, in view of the great quantity of

A survey of feature selection in Internet traffic characterization

In recent years, the world has witnessed an explosion of available information in almost every domain. The sharp increase in the scale of data sets poses a great challenge for scientific researchers when they try to characterize the data under research and extract useful knowledge at an acceptable cost. Features are used to convey information as measurable properties that characterize certain aspects of the objects under observation in data analysis, machine learning, etc. Due to the fast progress of hardware and storage technologies, the scale of feature sets has risen from tens to thousands or even more. In the case of Internet traffic characterization, besides older sample features like protocol category, complex features like the Fourier transform of the inter-arrival time of packets [1] are also considered in the latest research works. Handling such a big feature set can be computationally expensive; furthermore, irrelevant and redundant features may also decrease the accuracy of characterization results; finally, too many features can severely jeopardize the interpretability of results. As a consequence, feature selection serves as a fundamental preprocessing procedure in big data scenarios before stepping forward to further application of statistical or machine learning techniques. The main objective of feature selection is to select a subset of features that is as simple as possible without suffering a significant decline in accuracy for classification or forecasting, i.e. experimenting with the subset in place of the full feature set results in equal or better classification accuracy. The process of feature selection can be totally supervised. In many existing research initiatives, domain experts were required to provide a candidate subset of features considering possible domain relevance. However, nowadays, due to the large size of feature sets, purely manual feature selection has become infeasible. Consequently, various feature selection methods have been proposed to generate core feature subsets using different theories and techniques. Feature selection provides several advantages: it significantly reduces the computational burden, which in turn increases the performance (running time, precision, etc.) of classification or prediction models; by getting rid of redundant or irrelevant features, or even noise, further processing results do not suffer from the bad impacts brought by these interfering
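
The stated objective (keep classification accuracy while shrinking the feature set) can be demonstrated with generic tooling. The sketch below uses scikit-learn on synthetic data; it is not any specific method from the surveyed literature, and the synthetic "flows" merely stand in for real traffic features.

```python
# Minimal sketch of filter-style feature selection on synthetic data, checking
# that a reduced subset keeps classification accuracy. Generic scikit-learn
# usage, not a method from the survey; the synthetic "flow features" are
# placeholders for real traffic features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# 2000 synthetic "flows", 60 features of which only 8 are informative.
X, y = make_classification(n_samples=2000, n_features=60, n_informative=8,
                           n_redundant=20, random_state=0)

full = RandomForestClassifier(n_estimators=100, random_state=0)
reduced = make_pipeline(SelectKBest(mutual_info_classif, k=10),
                        RandomForestClassifier(n_estimators=100, random_state=0))

print("all 60 features :", cross_val_score(full, X, y, cv=5).mean().round(3))
print("best 10 features:", cross_val_score(reduced, X, y, cv=5).mean().round(3))
```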

Althusser – Reading Capital Part 3

character', is really an example of the structure of temporal insertion that I outlined above. The concepts of history and dynamics then become twins, one popular (history), the other learned (dynamics), since the second expresses very accurately the determination of the historical movement on the basis of a structure. This makes it possible to add a third term to these two: diachrony, which does not produce any new knowledge here, since it simply expresses the form of unique linear temporality which is implied by the identification of the first two concepts. But in reality, such a reading of Marx completely ignores the mode of constitution of the concepts of temporality and history in the theory of Capital. It may have been possible to adopt (or interpret) these concepts in their normal sense, i.e., in their ideological use, in a text such as the Preface to A Contribution, from which we started: there they merely have the function of registering and designating a theoretical field which has not yet been thought in its structure. But in the analysis of Capital, as our studies of primitive accumulation and of the tendency of the mode of production have shown, they are produced separately and differentially: their unity, instead of being presupposed in an always already given conception of time in general, must be constructed out of an initial diversity which reflects the complexity of the whole which is analysed. On this point it is possible to generalize from the way Marx posed the problem of the unity of the different cycles of the individual capitals in a complex cycle of the social capital: this unity must be constructed as an 'intertwining' whose nature is initially problematic. On this, Marx writes:

IDENTIFICATION AND CONTROL METHODS UTILIZING RANK AND CARDINALITY OPTIMIZATION APPROACH

In this section we use the approach presented in [8] to find an equivalent representation of problem P0. The need for equivalent representations of rank constraints arises because the rank function has several features that are undesirable in optimization problems. In particular, the rank function is non-smooth, non-linear and non-convex. In the optimization literature, smoothness and convexity are widely exploited, and the lack of these features in the rank function limits the tools that can be used to solve the optimization problem. Thus, equivalent representations aim at overcoming at least one of these undesirable features of the rank function. Recently, equivalent representations have been utilized to avoid the direct treatment of rank constraints ([8, 94], page 241 of [9]).
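
The two properties that cause the most trouble (non-convexity and non-smoothness) are easy to verify numerically. The check below is only an illustration of those properties; it is not the equivalent representation of problem P0 from [8], and the mention of the nuclear norm is a generic observation about smooth surrogates, not the approach of this work.

```python
# Quick numerical check of two properties of the rank function noted above
# (illustration only; this is not the equivalent representation from [8]).
import numpy as np

# Non-convexity: for rank-1 matrices A and B, convexity would require
# rank((A+B)/2) <= (rank(A)+rank(B))/2 = 1, but the midpoint has rank 2.
A = np.diag([1.0, 0.0])
B = np.diag([0.0, 1.0])
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B),
      np.linalg.matrix_rank((A + B) / 2))          # 1 1 2

# Non-smoothness: rank jumps discontinuously as an entry crosses zero, while a
# surrogate such as the nuclear norm (sum of singular values) varies continuously.
for eps in (1e-1, 1e-6, 0.0):
    M = np.diag([1.0, eps])
    print(eps, np.linalg.matrix_rank(M), np.linalg.norm(M, "nuc"))
```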

Greenhouse gas production in the Palmones River estuary (La producción de gases con efecto invernadero en el estuario del río Palmones)

The Earth’s climate has changed throughout history. From glacial periods or “ice ages”, when ice covered significant portions of the Earth, to interglacial periods, when ice retreated to the poles or melted entirely, the climate has continuously changed. During the last 2000 years, the climate has been relatively stable. Scientists have identified three departures from this stability, known as the Medieval Climate Anomaly, the Little Ice Age and the Industrial Era. An additional warm period has emerged in the last 100 years, coinciding with substantially increasing emissions of greenhouse gases from human activities. Prior to the Industrial Era, the Medieval Climate Anomaly and the Little Ice Age had defined the upper and lower boundaries of the climate’s recent natural variability, reflecting changes in climate drivers (the sun’s variability and volcanic activity) and the climate’s internal variability (random changes in the circulation of the atmosphere and oceans). Whether the temperature rise of the last 100 years has crossed the warm boundary defined by the Medieval Climate Anomaly has been a controversial topic in the science community.

Classification of Imaginary motor task from Electroencephalographic Signals: A Comparison of Feature Selection Methods and Classification Algorithms

The average of the spectrograms of single trials, for all electrodes and each class in subject 4, is shown in Figure 2 as an example of the features extracted using the STFT. It can be observed that the magnitude of the spectrum decreases at electrode C3 around 10 Hz and 20 Hz when the subject executes the motor imagery task corresponding to the movement of the right hand. Similar activity is observed at electrode C4 in the same frequency bands when the subject executes the imagery task corresponding to the left hand movement. This phenomenon is observed across subjects and is termed event-related desynchronization (ERD) in the literature. Based on this, the feature vector is composed of the average activity across time (between 3.5 and 5.5 s) for each individual frequency. That is, the feature vector represents the power in each frequency band during the 2 seconds after the beginning of the motor task. The size of the feature vector is
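
The feature extraction described here (average spectrogram power over the 3.5–5.5 s window, per frequency) can be sketched with SciPy. The sampling rate, trial length, window settings and the two channels standing in for C3/C4 below are assumptions for illustration, not the paper's exact parameters, and the synthetic noise merely stands in for EEG.

```python
# Minimal sketch of the STFT feature extraction described above, on synthetic
# data. Sampling rate, trial length, window settings and channel count are
# assumptions for illustration, not the paper's exact parameters.
import numpy as np
from scipy.signal import spectrogram

FS = 250                      # assumed sampling rate [Hz]
T_TRIAL = 7.0                 # assumed trial length [s]
rng = np.random.default_rng(0)

# One synthetic trial, 2 channels (standing in for C3 and C4) of noise.
trial = rng.standard_normal((2, int(FS * T_TRIAL)))

def stft_features(eeg, fs=FS, t_start=3.5, t_stop=5.5, f_max=30.0):
    """Average spectrogram power over [t_start, t_stop] for each frequency bin."""
    feats = []
    for channel in eeg:
        f, t, Sxx = spectrogram(channel, fs=fs, nperseg=fs, noverlap=fs // 2)
        t_mask = (t >= t_start) & (t <= t_stop)
        f_mask = f <= f_max
        feats.append(Sxx[f_mask][:, t_mask].mean(axis=1))  # mean power per freq
    return np.concatenate(feats)   # feature vector: channels x frequency bins

fv = stft_features(trial)
print(fv.shape)   # 2 channels x 31 frequency bins (0..30 Hz in 1 Hz steps)
```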

A study in vowels: Comparing phonetic difficulties between languages

The second new method I am adding is the Johari window, a heuristic method developed back in 1955 by the psychologists Joseph Luft and Harrington Ingham. Heuristics are techniques that solve problems, learn or discover new information through practical methods, though in some cases, due to cognitive biases, they can lead to systematic errors. With proper care, though, heuristics yield easily accessible information for solving problems or even abstract issues (Michalewicz and Fogel 2000).

Strength and porosity evolution of two cement mortars submerged in pig slurry

Slurry composition depends on many factors, including animal physiology, type of feed, facility typology and management, and so forth. Its complex chemical composition, with organic and inorganic compounds, varies over time. The three main groups of organic components are: organic acids (acetic, propionic, isovaleric), nitrogenous compounds (primarily ammonia-based) and a number of hydrosulphide compounds deriving from urea denaturation. The result is a compound with a pH ranging from 7 to 8. The Spanish structural concrete code, EHE [4], regards substances with a pH of over 6.5 as non-aggressive from the standpoint of acid damage. Nonetheless, research has shown that concrete and mortar in contact with such slurries deteriorate systematically, with a decline in their load capacity [5]. Clearly, then, degradation is the outcome of synergies between various factors. Some researchers have attempted to reproduce this sort of concrete degradation in the laboratory [6], by testing different types of cements after exposure to specific organic salts with varying pH values or by analyzing the mechanisms involved in cement matrix alterations caused by a combination of acids [7].

Mothering and governing: How news articulates hegemonic gender roles in the case of Governors Jane Swift and Sarah Palin

This analytical tool is useful because it attends to the underlying presumptions resulting from articulation: connections that are not necessarily conscious or made explicit to the authors who form them or those who make sense of them. Ideology as conceptualized through Hall is firmly placed within the “function of discourse and of the logic of social processes rather than an intention of the agent” (Hall, 1982, p. 88). The function of ideology is understood through Gramsci’s concept of hegemony: an ecology of values, attitudes, beliefs and morality that permeates society, resulting in ideological and cultural power exerted by a dominant group (Gramsci, 1971). Because its infiltration is perceived as the norm, it becomes a part of what is considered “common sense”, so that the philosophy, culture and morality of the ruling elite dictate the normative philosophy, culture and morality and come to appear as the natural order of things (Boggs, 1976). Gramsci theorizes hegemony as a process, a constant struggle to define common sense understandings within a culture. “Hegemony, or any form of articulation, is never final or total”, which means that re-articulations are always a possibility (Carpentier and Cammaerts, 2006, p. 966).

SAMPLING, MEASUREMENT METHODS, AND INSTRUMENTS

Dermal and ingestive routes of entry are much more significant than inhalation for a large number of chemicals. For example, a fifteen-minute exposure of the hands and forearms to liquid glycol ethers [2-methoxyethanol (ME) and 2-ethoxyethanol (EE)] will result in a dose to the body well in excess of the eight-hour inhalation dose at their recommended air exposure limits. (Biological monitoring for the urinary metabolites methoxyacetic acid and ethoxyacetic acid was used to estimate the absorption via skin and lung.) Unfortunately, many industrial hygienists are familiar only with air sampling and fail to evaluate significant exposures caused by surface contamination. Wipe sampling is an important tool of worksite analysis, both for identifying hazardous conditions and for evaluating the effectiveness of personal protective equipment, housekeeping, and decontamination programs. As described below, wipe sampling is an important tool for assessing compliance with certain OSHA requirements even though there are few specific criteria for acceptable surface contamination amounts.

Optimization of "Saccharomyces cerevisiae" α-galactosidase production and application in the degradation of raffinose family oligosaccharides

ScAGal not only has strong resistance to treatment with all the proteases tested, but also shows higher enzymatic activity in their presence (Fig. 3a). After incubation at 37 °C for 1 h with trypsin, chymotrypsin, proteinase K, subtilisin and pepsin, the enzyme’s residual activity was 122, 132, 142, 117 and 164%, respectively, varying only slightly up to 16 h. The protease resistance of several α-Gals has been tested, but most trials used short incubation times of 30–60 min, without testing longer incubation times. ScAGal was more protease-resistant than most of them. The enzymes PCGI from Pleurotus citrinopileatus [23] and Aga-BC7050 [38] were activated by α-chymotrypsin and trypsin but were inhibited by proteinase K. ABGI from Agaricus bisporus only showed resistance to α-chymotrypsin [34]. TtGal27A from Thielavia terrestris [32], PDGI from Pleurotus djamor [35] and rAgas2, isolated from the gut metagenome of Hermetia illucens [42], retained 90, 60 and 70%, respectively, of the initial activity in the presence of neutral proteases. On the other hand, the deglycosylated ScAGal was also slightly activated by pepsin (103% residual activity) and showed strong resistance to subtilisin (98% residual activity), and some tolerance to trypsin (52% residual activity), chymotrypsin (53% residual activity) and proteinase K (45% residual activity) after 1 h of treatment (Fig. 3b). In the food and feed industry, combinations of enzymes are used, including α-Gals,
