(iii) the Quality Design Model, modeling data and processes together in order to model process-based improvement actions. As for techniques to produce artifacts, design patterns for quality improvement are proposed. Quality improvement is a complex activity that typically requires investments in terms of money and people skills. The reuse of solutions and experiences can be very useful in supporting quality improvement, and can reduce time and costs considerably. For instance, a variety of techniques for data improvement are proposed in the literature and can be adopted as the basis for design patterns. The most straightforward solution suggests the adoption of data-oriented inspection and rework techniques, such as data bashing or data cleaning. These techniques focus on data values and can solve problems related to the data accuracy and data consistency quality dimensions. A fundamental limitation of these techniques is that they do not prevent future errors, so they are considered appropriate only when data are not modified frequently. On the other hand, more frequent use of data bashing and data cleaning algorithms involves high costs that can be difficult to justify.
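As a loose illustration of such inspection-and-rework techniques (a sketch, not a pattern taken from the paper), the following normalizes field values and drops duplicates, touching the consistency and accuracy dimensions mentioned above; the record fields are invented for the example:

```python
# Illustrative data-cleaning pass over hypothetical customer records.
def clean_records(records):
    seen = set()
    cleaned = []
    for rec in records:
        # Consistency: normalize formatting so equivalent values match.
        name = rec["name"].strip().title()
        country = rec["country"].strip().upper()
        key = (name, country)
        # Accuracy/duplication: drop records that are identical
        # after normalization.
        if key in seen:
            continue
        seen.add(key)
        cleaned.append({"name": name, "country": country})
    return cleaned
```

As the paragraph notes, a pass like this repairs the current values but does nothing to prevent future errors, which is why it suits rarely modified data.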
Differences in TTR and in the quality of anticoagulation control between patients managed by GPs and those managed by haematology clinics have been reported previously. In a meta-analysis by Baker et al,25 8 studies with more than 22 000 patients demonstrated a mean TTR of 55%, which was significantly lower when control was carried out by GPs rather than by Haematology (63 vs 51%). In our study, the mean TTR estimated by the Rosendaal method had higher values than in the American study. In contrast, mean TTR in our analysis was significantly higher in patients controlled by GPs than by haematologists. In addition, the higher propensity towards poorer control associated with the haematologist being responsible for control of anticoagulation was not modified when the analyses were adjusted for the variables associated with poor control of anticoagulation, such as a high HAS-BLED score or a history of bleeding. It should be noted that we are not comparing professionals (haematologists vs primary care physicians) but two differing anticoagulation management systems (including number of visits, patient accessibility and availability of important clinical information). However, there are other models to assess the quality of anticoagulation. Hou et al26 performed a systematic review of 8 clinical trials and 9
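For readers unfamiliar with the Rosendaal method referred to above: it estimates TTR by linearly interpolating INR values between consecutive measurements and counting the interpolated days that fall within the therapeutic range. The sketch below follows that standard definition; the default range of 2.0–3.0 is a common convention and an assumption here, not a detail taken from the study:

```python
from datetime import date

def ttr_rosendaal(measurements, low=2.0, high=3.0):
    """Estimate time in therapeutic range (TTR) by the Rosendaal
    linear-interpolation method.

    measurements: list of (date, INR) tuples, sorted by date.
    Returns the fraction of days whose interpolated INR lies
    within [low, high].
    """
    in_range_days = 0
    total_days = 0
    for (d0, inr0), (d1, inr1) in zip(measurements, measurements[1:]):
        span = (d1 - d0).days
        if span == 0:
            continue
        total_days += span
        # Interpolate the INR for each day in the interval [d0, d1).
        for day in range(span):
            inr = inr0 + (inr1 - inr0) * day / span
            if low <= inr <= high:
                in_range_days += 1
    return in_range_days / total_days if total_days else 0.0
```

For example, a patient whose INR climbs linearly from 1.0 to 3.0 over ten days spends the first half below range, giving a TTR of 0.5 over that interval.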
The generation of knowledge begins when an employee has an idea and transmits it to other members of the organization through a cycle which feeds back on itself and allows for learning. Software systems such as wikis, forums, bulletin boards, or blogs support themselves on the basis of this factor. Fong and Choi (2009) highlight the fact that this exchange of knowledge is the first step towards its management. Spontaneous meetings in the corridor or at the coffee machine allow experience to be acquired by individuals in their daily practice, sharing information with their colleagues about specific cases. This knowledge is, to a large degree, tacit. The exchange of informal knowledge is thus defined as all the forms of exchange that exist alongside the institutionalized forms of exchange of knowledge (Fong and Choi, 2009). Furthermore, Lu and Tsai (2004) hold that organizations ought to focus on the creation of knowledge in order to prevent their existing knowledge from becoming rapidly obsolete. Tserng and Lin (2004) affirm that the exchange of experiences and re-utilization of knowledge brings other benefits with it, such as a reduced need to consult previous projects, an improvement in the quality of solutions and a minimization of the time and costs involved in finding solutions to problems, as there is no need to constantly find answers to the same questions.
Other likely benefits of RTPI systems are as follows: increased willingness to pay; more efficient travelling through better use of waiting time; positive psychological effects such as reduced uncertainty, an increased feeling of personal security, the creation of a general sense of trust in the PT system, and increased ease of use; a better overall image of the system; and greater passenger satisfaction. Several studies have dealt with these issues, coming to similar conclusions: RTPI systems offer several benefits to PT users, and the majority of users are quite satisfied with them [Lappin, 2002; Tang & Thakuriah, 2012]. Research in this field has examined the potential effect of the introduction of RTPI systems on the perceived quality of service and on bus service performance. Politis et al. (2010) evaluated a bus passenger information system from the users' point of view in the city of Thessaloniki, Greece. The analysis performed on the data collected from the survey of both regular and occasional PT users in the city showed that the existing RTPI system is generally evaluated positively. Satisfaction levels were quite high, over 80% for both the content and the reliability of the information given [Politis et al., 2010]. Although extensive studies have been conducted on travel behaviour changes as a result of Information and Communications Technology (ICT) in general, studies of this nature in the case of real-time transit information systems are relatively few [Tang & Thakuriah, 2012].
In the paper we have shown how to model the data of Idea Management Systems using Semantic Web principles. By doing so we have described the research and the difficulties that emerge in the process of designing a domain-based ontology. However, we regard the presented ontology as only a first step towards achieving our goals. The ontology lays the foundations for knowledge management based on interlinking enterprise systems and web assets to increase information awareness and help in innovation assessment. In terms of future work, we plan to experiment with interlinking Idea Management data with other specific systems and to research the possibilities of automatic ranking and recommendation of ideas. Furthermore, the evaluation presented in this paper shows that experimenting with new systems can reveal gaps in the ontology; therefore we shall continue its improvement to reflect Idea Management Systems data as well as possible.
Rationale. The rationale of the model is based on three basic premises. First, all organisms are data, information, and knowledge systems. They could not deal with the external world without them. Second, information is a state of consciousness (i.e., awareness). Thus, information is a cognitive/affective process and the products of that process (Miller, 1978). The focus is on the products and the management of these processes (Drucker, 2001). Third, technology augments human capacities and the products derived therefrom (Engelbart, 1962).
Consequently, land use change has become an area of particular concern due to rapid land conversion practices in the highlands of the country. In recent decades, Remote Sensing (RS) with multi-temporal high-resolution satellite data has been widely used to obtain land cover information such as the degradation level of forests and wetlands, the rate of urbanization, the intensity of agricultural activities, and other human-induced changes (Yuksel et al., 2008). However, these bio-physical approaches do not give information about why changes occur. A land use/cover study requires an understanding of people and their societal situation, their priorities, livelihood strategies, views on the land, and the wider implications of social, political, cultural, biophysical and institutional factors, among others (Maro, 2011). Incorporating the local experiences of key informants in the community provides information on past, present and expected future land use changes (Sandewall et al., 2001). It is therefore necessary to go beyond disciplinary trend studies and examine methods for integrating LULC and social research, so as to draw on the knowledge and experience of different stakeholders. The integration of remote sensing and household surveys is thus an important approach for studying changes in land cover patterns and dynamics, yielding rapid, economical, reliable, and accurate results (Sertel et al., 2008). As explicitly stated by Maro (2011), one of the merits of using qualitative research in social science and survey research methods to understand local perceptions of land use change is its obvious contribution to answering the questions 'why is change occurring?' and 'so what?'. Klintenberg et al. (2007) used individual semi-structured interviews with local farmers to understand whether national and local perceptions of environmental change in central Northern Namibia were related.
These and other similar studies show that a combination of local and scientific knowledge can lead to a more useful assessment of land use change and its implications for local land-users and managers (Klintenberg et al., 2007). Hence, the integration of information from household surveys and data on land cover changes derived from remote sensing improves our understanding of the causes and processes of LULC changes (Benoît et al., 2000).
The knowledge in libraries about information management is still relevant in the internet world. Management of bibliographic data, metadata formats, and the use of authority systems, thesauri and vocabularies will make access to information easier and better structured. But these tools are evolving. Content is growing and the type of information is changing. Above all, the internet is moving from a network that links pages to a network that links data: the Web of Data. The real value of the Web of Data lies in the relationships between the data. These relationships (referred to as links) put data in context and enrich their meaning and expressiveness.
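As a loose illustration of how such links put data in context, the sketch below (plain Python, with invented identifiers loosely styled after common vocabularies) represents data as subject–predicate–object triples and follows the relationships between them:

```python
# A tiny, illustrative triple store: each fact is a
# (subject, predicate, object) statement, and the links between
# records give context to values that would otherwise be isolated.
triples = [
    ("book:1984", "dc:creator", "person:orwell"),
    ("book:1984", "dc:subject", "topic:dystopia"),
    ("person:orwell", "foaf:name", "George Orwell"),
    ("topic:dystopia", "skos:broader", "topic:fiction"),
]

def objects(subject, predicate):
    """Return all objects linked from `subject` via `predicate`."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Following links enriches the bare record "book:1984" with context:
author = objects("book:1984", "dc:creator")[0]
print(objects(author, "foaf:name"))  # → ['George Orwell']
```

The point of the sketch is that the author's name is not stored on the book record at all; it is reached by traversing a link, which is exactly the "relationships between the data" the paragraph describes.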
Jorgensen et al. (2006) and Jorgensen (2008) define three different levels of integration: "correspondence", which refers to cross-references and internal coordination; "generic", which is the understanding of generic processes and tasks in the management cycle; and "integration", the creation of a culture of learning, stakeholder participation and continuous improvement of performance. Regarding MS integration, Karapetrovic and Willborn (1998) define three main elements of a standardized MS which can be integrated at different levels, namely goals, processes, and resources. Karapetrovic et al. (2006) conducted an empirical study of the extent of integration of these elements, obtaining responses from 176 Catalan organizations with multiple cross-functional certificates such as ISO 9001 or ISO 14001. The authors found a high level of integration of the human resources, the company policy, objectives, the management system manual, and the processes of document control, record control, auditing, and management review. However, the authors found that aspects such as the use of integrated records, instructions or procedures, found at tactical organizational levels, or the planning, determination of requirements, product realization and other internal business processes, seemed to be integrated to a lesser extent. In the same direction, Bernardo et al. (2009) empirically studied the integration of environmental management systems with other MSs in Spain. To this end, an empirical study was carried out on 435 companies that were registered to multiple management system standards, including at minimum ISO 14001:2004 and ISO 9001:2000. Overall, 362 of those organizations indicated that they had integrated all or at least some of their standardized management systems. In particular, 14% of organizations did not integrate their MSs, 7% integrated only some of them, and 79% integrated all their MSs.
Grammar begins with the ability to combine words to create a new, higher level unit (a syntagm—two or more linguistic elements that occur sequentially in the chain of speech and have a specific relationship). This means that in order to create a syntagmatic relationship between two or more vocabulary words, different word categories have to be distinguished, specifically nouns (objects) and verbs (actions). To create a simple phrase, only two types of elements are indeed required: nouns (corresponding to the so-called nominal phrase) and verbs (corresponding to the so-called verbal phrase). The crucial point in emerging grammar is not just the complexity of the lexical/semantic system. What is really important is to have words corresponding to different classes that can be combined to form a higher level unit (syntagm, phrase, and sentence). One of the words has to refer to an object (noun); the other is an action (verb). A sentence has to contain a subject (noun) and a verb, indicating that two different word categories are required.
ContentGuard, Inc. is driving the standard for interoperability in Digital Rights. The company's broad foundation portfolio of DRM system patents and its Rights Expression Language, XrML (eXtensible rights Markup Language), were originally developed at the Xerox Palo Alto Research Center (PARC). ContentGuard is driving the adoption of XrML as the industry standard for access and usage rights. XrML has been selected as the basis for the Moving Picture Experts Group (MPEG) and the Open eBook Forum (OeBF) Rights Expression Language, and has been contributed to the Organization for the Advancement of Structured Information Standards (OASIS) Rights Language Technical Committee. Launched in April 2000, ContentGuard conducts its operations in Bethesda, MD, and El Segundo, CA. The company is owned by Xerox Corporation (NYSE: XRX), with Microsoft Corporation (NASDAQ: MSFT) holding a minority position.
A long-standing task related to Ramsar site data and information has been the establishment and maintenance of a standard record of changes to the ecological character of Ramsar Sites reported under Article 3.2 of the Convention. Resolution X.15 on Describing the ecological character of wetlands, and data needs and formats for core inventory: harmonized scientific and technical guidance now includes an ecological character description sheet, containing a section which can be used as a simple mechanism for reporting change. Work will be undertaken in future to establish how best to incorporate these ecological character descriptions and Article 3.2 reports from Parties into the RSIS.
Products already assembled in one process are inspected in the next process. Next-process information is saved as raw data in separate files (one file per inspection jig). The files are then pulled from the measurement control box onto a controlled USB flash memory and passed to an off-line computer, where they are sorted and merged into a single file before being passed to a statistical application, which generates plots and reports that convey important information about the actual process. The operations performed with the statistical application depend on what kind of information the engineer needs. As can be seen in Figure 6, there are two departments gathering information from the line. As part of the Atarimae, engineers must carry out these actions every day.
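The sort-and-merge step described above can be sketched roughly as follows. This is only an illustration: the file pattern and the column names ("timestamp", "measurement") are assumptions, not details taken from the actual line setup.

```python
import csv
import glob

def merge_inspection_files(pattern, out_path):
    """Merge the per-jig raw-data files matching `pattern` into one
    CSV sorted by timestamp, ready for the statistical application.
    Returns the number of merged rows."""
    rows = []
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as f:
            rows.extend(csv.DictReader(f))
    # Sort across all jigs so the statistical application sees one
    # chronological stream of measurements.
    rows.sort(key=lambda r: r["timestamp"])
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["timestamp", "measurement"])
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

In practice the merged file would then be handed to whatever statistical package the engineers use to generate the plots and reports mentioned above.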
In both cases, the team shares the same final information-seeking purpose, but in the first case each member is in charge of fulfilling a different part of it, while in the second case a more holistic approach is undertaken and everybody tries to achieve the whole information-seeking purpose. For example, if the information-seeking purpose is to elaborate the state of the art of a topic, each member of the research team can take care of looking for relevant documents published within different ranges of years so that, together, they cover the whole desired period. Alternatively, all of them can perform the information-seeking activity considering the whole period of time. In terms of information retrieval, the two approaches differ in precision and recall, which are the most common measures of effectiveness in this field (Manning et al., 2008). Collaborative work can potentially provide higher recall: the number of documents that has to be processed by each CS researcher is smaller, so it is harder for a relevant document to go unidentified; but precision can be lower, as determining whether a document is relevant depends only on the criterion of one CS researcher. In contrast, results obtained by teams working in parallel can potentially be more precise, as the findings can be contrasted: if all or most of the CS researchers have found the same document and have considered it relevant, then it is almost certain that the document is truly relevant. In this case, however, it is more probable that some relevant documents remain unidentified at the end of the process, as the document collection each researcher faces is bigger, and the CS researcher potentially has to perform more, and more complex, information-seeking tasks and spend more time on each of them.
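As a reminder of the two measures discussed above, the following sketch computes precision and recall over sets of document identifiers; the document sets themselves are invented for the example:

```python
def precision_recall(retrieved, relevant):
    """Precision: fraction of retrieved documents that are relevant.
    Recall: fraction of relevant documents that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# A researcher retrieves four documents; three are truly relevant.
p, r = precision_recall({"d1", "d2", "d3", "d4"}, {"d1", "d2", "d5"})
# p = 2/4 = 0.5  (half the retrieved documents are relevant)
# r = 2/3        (one relevant document, d5, was missed)
```

The trade-off in the paragraph maps directly onto these quantities: dividing the period raises recall by shrinking each member's workload, while duplicated parallel searching raises precision by cross-checking relevance judgments.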
The expansion of e-learning in higher education has been well noted in the literature (Buzdar et al., 2016). The growing variety of Massive Open Online Course (MOOC) offerings (Salmon et al., 2015) and their ambition to obtain credit-bearing status (Blackmon, 2016) denote just that. So does the emergence of the "post-traditional learner," who craves control over how, where, and when to acquire knowledge (Bichsel, 2013). Maintaining academic integrity becomes an increasingly challenging exercise as physical entities become represented by virtual aliases, when class size increases, when students are geographically dispersed, and when the teaching and assessment roles become disaggregated. The traditional methods for ensuring that the trust relationship stays intact are difficult to translate to learning environments where students and instructors are separated by a gap of time and space and use technology to communicate (Amigud, 2013). These methods stipulate how, when, and where assessment activities take place and are, at least partly, responsible for the disparity in the expectations and experiences of post-traditional learners. When applied to the e-learning context, the traditional strategies negate the very premise of openness and convenience, let alone administrative and economic efficiency. Hence emerges the need for a robust academic integrity strategy