
Business semantics management

Business semantics management (BSM)[1][2] encompasses the technology, methodology, organization, and culture that bring business stakeholders together to collaboratively reconcile their heterogeneous metadata, and consequently to apply the derived business semantics patterns to establish semantic alignment[3] between the underlying data structures.

BSM is established by two complementary process cycles, each grouping a number of activities: the semantic reconciliation cycle and the semantic application cycle. The two cycles are tied together by the unification process. This double process cycle is applied iteratively until an optimal balance of differences and commonalities between stakeholders is reached that meets the semantic integration requirements. The approach is based on research on community-based ontology engineering[1][2] that has been validated in European projects, government, and industry.

Semantic Reconciliation

Semantic reconciliation is a process cycle consisting of four consecutive activities: scope, create, refine, and articulate. First, the community is scoped: user roles and affordances are assigned. Next, relevant facts are collected from documentation such as natural language descriptions, (legacy) logical schemas, or other metadata, and this scope is decomposed into elicitation contexts. The deliverable of scoping is an initial upper common ontology that organizes the key upper common patterns shared and accepted by the community. These upper common patterns define the community's current semantic interoperability requirements. Once the community is scoped, all stakeholders syntactically refine and semantically articulate the upper common patterns.
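A minimal sketch of the scoping deliverable might look like the following. The community name, roles, patterns, and meaning descriptions are all invented for illustration; no BSM tool prescribes this exact structure:

```python
# Hypothetical sketch of an initial upper common ontology: the patterns the
# community has scoped, plus the informal articulations agreed so far.
upper_common_ontology = {
    "community": "HR interoperability group",          # assumed name
    "roles": {"alice": "steward", "bob": "stakeholder"},
    "patterns": [
        # upper common patterns elicited from legacy schemas and documents
        ("Employee", "has", "Competency"),
        ("Competency", "is part of", "CompetencyModel"),
    ],
    "articulations": {
        # informal meaning descriptions accepted by the community
        "Employee": "A person employed by the organization.",
        "Competency": "A demonstrable skill or area of knowledge.",
    },
}

# Articulation attaches a meaning description to every term used in the
# patterns; listing the gap shows what the community still has to refine.
terms = {t for p in upper_common_ontology["patterns"] for t in (p[0], p[2])}
unarticulated = terms - upper_common_ontology["articulations"].keys()
print(sorted(unarticulated))  # -> ['CompetencyModel']
```

Iterating the cycle would shrink the `unarticulated` set to empty as stakeholders refine and articulate the remaining terms.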

Unification

During unification, a new proposal for the next version of the upper common ontology is produced, aligning relevant parts of the common and divergent stakeholder perspectives. Unification is worthwhile if semantic reconciliation results in a number of reusable, language-neutral, and context-independent patterns for constructing business semantics, articulated with informal meaning descriptions.

Semantic Application

Semantic application is a process cycle consisting of two consecutive activities, select and commit, in which the scoped information systems are committed to selected, consolidated business semantics patterns. First, relevant patterns are selected from the pattern base. Next, the interpretation of this selection is semantically constrained. Finally, the various scoped sources and services are mapped onto (i.e., committed to) this selection. The selection and its axiomatization should approximate the intended business semantics. This can be verified by automatic verbalization into natural language and by validation of the unlocked data. Validation or deprecation of the commitments may result in another iteration of the semantic reconciliation cycle.
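The select-commit-verbalize flow can be sketched as follows. The pattern, the table and column names, and the verbalization template are illustrative assumptions, not part of any particular BSM implementation:

```python
# Hypothetical sketch: commit one scoped source to a selected pattern, then
# verbalize the commitment so business stakeholders can validate it.
PATTERN = ("Employee", "has", "BirthDate")  # selected from the pattern base

COMMITMENT = {
    # maps pattern terms onto one scoped source (names are assumptions)
    "Employee": "hr.employees",
    "BirthDate": "hr.employees.dob",
}

def verbalize(pattern, commitment):
    """Render a commitment as a natural-language sentence for validation."""
    subject, verb, obj = pattern
    return (f"Each {subject} (stored in {commitment[subject]}) "
            f"{verb} a {obj} (stored in {commitment[obj]}).")

print(verbalize(PATTERN, COMMITMENT))
# -> Each Employee (stored in hr.employees) has a BirthDate (stored in hr.employees.dob).
```

If stakeholders reject such a sentence, the commitment is deprecated and the reconciliation cycle runs again.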

Business semantics

Business semantics[1] are the information concepts that live in the organization and are understandable to both business and IT. Business semantics describe business concepts as they are used and needed by the business, rather than describing the information from a technical point of view.

One important aspect of business semantics is that they are shared across many disparate data sources. Many data sources share the same semantics but use a different syntax, or format, to describe the same concepts.
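As a minimal illustration (all record and field names here are invented), two sources can carry the same concept under different syntax, and a shared set of concept names with per-source mappings lets both be read uniformly:

```python
# Two data sources: same semantics, different syntax (field names).
hr_record = {"emp_name": "Ada Lovelace", "dob": "1815-12-10"}
crm_record = {"fullName": "Ada Lovelace", "birthDate": "1815-12-10"}

# A shared business semantics pattern names each concept once...
HR_MAPPING = {"person_name": "emp_name", "birth_date": "dob"}
CRM_MAPPING = {"person_name": "fullName", "birth_date": "birthDate"}

def unlock(record, mapping):
    """Read a source record through the agreed concept names."""
    return {concept: record[field] for concept, field in mapping.items()}

# ...so both sources yield identical data under the shared concepts.
assert unlock(hr_record, HR_MAPPING) == unlock(crm_record, CRM_MAPPING)
```

The mappings, not the sources, absorb the syntactic differences; adding a third source means writing one more mapping, not rewriting consumers.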

How these business semantics are described is less important; several approaches can be used, such as UML, object-role modeling, or XML. This corresponds to Robert Meersman's statement that semantics are "a (set of) mapping(s) from your representation language to agreed concepts (objects, relationships, behavior) in the real-world".[4] In the construction of information systems, semantics have always been crucial. In previous approaches, these semantics were left implicit (i.e., in the mind of the reader or writer), hidden away in the implementation itself (e.g., in a database table or column code), or informally captured in textual documentation.[5] According to Dave McComb, "The scale and scope of our systems and the amount of information we now have to deal with are straining that model."[6]

Nowadays, information systems need to interact in a more open manner, and it becomes crucial to formally represent and apply the semantics these systems are concerned with.

Application

Business semantics management empowers all stakeholders in the organization by providing a consistent and aligned definition of the organization's important information assets.

The available business semantics can be leveraged in the so-called business/social layer of the organization. For example, they can be coupled to a content management application to provide the business with a consistent vocabulary, enable better navigation or classification of information, or be leveraged by enterprise search engines to build richer, semantic-web-ready websites.

Business semantics can also be used to increase operational efficiency in the technical/operational layer of the organization. They provide an abstracted way to access and deliver data more efficiently. In that respect, the approach is similar to enterprise information integration (EII), with the added benefit that the shared models are described not in technical terms but in a way that is easily understood by the business.

Collibra was the first organization to commercialize the idea behind business semantics management. Collibra's approach to business semantics management is based on DOGMA, a research project at the Vrije Universiteit Brussel.

References
  1. De Leenheer, Pieter (2010). "Business Semantics Management: A Case Study for Competency-centric HRM". Elsevier.
  2. De Leenheer, Pieter (2009). On Community-based Ontology Evolution. PhD thesis, Vrije Universiteit Brussel.
  3. http://www.information-management.com/media/pdfs/collibra.pdf
  4. Sheth, Amit (1997). "Data Semantics: What, Where and How?". Proceedings of the 6th IFIP Working Conference on Data Semantics (DS-6). Chapman and Hall. pp. 601–610.
  5. Morgan, Tony (2005). "Expressing Business Semantics" (PDF). Presentation at the European Semantic Web Conference (2005).
  6. DMReview.com. "Why is Business Semantics the New Hot Topic?". Retrieved 23 November 2008.
Content from Wikipedia Licensed under CC-BY-SA.




Artifact-centric business process model

topic

Artifact-centric business process model represents an operational model of business processes in which the changes and evolution of business data , or business entities , are considered as the main driver of the processes. The artifact-centric approach, also called activity-centric or process-centric business process modeling, focuses on describing how business data is change/updated, by a particular action or task, throughout the process. Overview In general, a process model describes activities conducted in order to achieve business goals, informational structures, and organizational resources. Workflows , as a typical process modeling approach, often emphasize the sequencing of activities (i.e., control flows ), but ignore the informational perspective or treat it only within the context of single activities. Without a complete view of the informational context, business actors often focus on what should be done instead of what can be done, hindering operational innovations. Business process modeling is a ...more...



IDEF

topic

IDEF methods: part of the systems engineer's toolbox IDEF , initially abbreviation of ICAM Definition , renamed in 1999 as Integration DEFinition , refers to a family of modeling languages in the field of systems and software engineering . They cover a wide range of uses, from functional modeling to data, simulation, object-oriented analysis/design and knowledge acquisition. These "definition languages" were developed under funding from U.S. Air Force and although still most commonly used by them, as well as other military and United States Department of Defense (DoD) agencies, are in the public domain . The most-widely recognized and used components of the IDEF family are IDEF0 , a functional modeling language building on SADT , and IDEF1X , which addresses information models and database design issues. Overview on IDEF methods IDEF refers to a family of modeling language , which cover a wide range of uses, from functional modeling to data, simulation, object-oriented analysis/design and knowledge acquisiti ...more...



HL7 Services Aware Interoperability Framework

topic

This article documents the effort of the Health Level Seven(HL7) community and specifically the HL7 Architecture Board (ArB) to develop an interoperability framework that would support services, messages, and Clinical Document Architecture(CDA) ISO 10871. HL7 provides a framework and standards for the exchange, integration, sharing, and retrieval of electronic health information. SAIF Overview The HL7 Services-Aware Interoperability Framework Canonical Definition (SAIF-CD) provides consistency between all artifacts, and enables a standardized approach to enterprise architecture (EA) development and implementation, and a way to measure the consistency. SAIF is a way of thinking about producing specifications that explicitly describe the governance, conformance, compliance, and behavioral semantics that are needed to achieve computable semantic working interoperability. The intended information transmission technology might use a messaging, document exchange, or services approach. SAIF is the framework that ...more...



Knowledge Discovery Metamodel

topic

Knowledge Discovery Metamodel ( KDM ) is a publicly available specification from the Object Management Group (OMG). KDM is a common intermediate representation for existing software systems and their operating environments, that defines common metadata required for deep semantic integration of Application Lifecycle Management tools. KDM was designed as the OMG's foundation for software modernization , IT portfolio management and software assurance. KDM uses OMG's Meta-Object Facility to define an XMI interchange format between tools that work with existing software as well as an abstract interface ( API ) for the next-generation assurance and modernization tools. KDM standardizes existing approaches to knowledge discovery in software engineering artifacts, also known as software mining . History In November 2003, the OMG's Architecture-Driven Modernization Task Force recommended, and the Platform Technical Committee issued, the Knowledge Discovery Metamodel (KDM) RFP. The objective of this RFP was to provide ...more...



Ontology (information science)

topic

In computer science and information science , an ontology is a formal naming and definition of the types, properties, and interrelationships of the entities that really exist in a particular domain of discourse . An ontology (in information science) compartmentalizes the variables needed for some set of computations and establishes the relationships between them. The fields of artificial intelligence , the Semantic Web , systems engineering , software engineering , biomedical informatics , library science , enterprise bookmarking , and information architecture all create ontologies to limit complexity and organize information. The ontology can then be applied to problem solving . The knowledge density of a knowledge graph is the average number of attributes and binary relations issued from a given entity, and is commonly measured in facts per entity. Etymology and definition The term ontology has its origin in philosophy and has been applied in many different ways. The word element onto- comes from the Greek ...more...



Amit Sheth

topic

Dr. Amit Sheth is a computer scientist at Wright State University in Dayton , Ohio . He is the Lexis Nexis Ohio Eminent Scholar for Advanced Data Management and Analysis. Up to June 2017, Sheth's work has been cited by 37,304 publications. He has an h-index of 95, which puts him among the top 100 computer scientists with the highest h-index . Prior to founding the Kno.e.sis Center , he served as the director of the Large Scale Distributed Information Systems Lab at the University of Georgia in Athens, Georgia . Education Sheth received his bachelor's in engineering from the Birla Institute of Technology and Science in computer science in 1981. He received his M.S. and Ph.D. in computer science from the Ohio State University in 1983 and 1985, respectively. Research Semantic interoperability/integration and semantic web Sheth has investigated, demonstrated, and advocated for the comprehensive use of metadata . He explored syntactical, structural, and semantic metadata; recently, he has pioneered ontology-d ...more...



Model-driven engineering

topic

Model-driven engineering ( MDE ) is a software development methodology that focuses on creating and exploiting domain models , which are conceptual models of all the topics related to a specific problem. Hence, it highlights and aims at abstract representations of the knowledge and activities that govern a particular application domain , rather than the computing (f.e. algorithmic) concepts. Overview The MDE approach is meant to increase productivity by maximizing compatibility between systems (via reuse of standardized models), simplifying the process of design (via models of recurring design patterns in the application domain), and promoting communication between individuals and teams working on the system (via a standardization of the terminology and the best practices used in the application domain). A modeling paradigm for MDE is considered effective if its models make sense from the point of view of a user that is familiar with the domain, and if they can serve as a basis for implementing systems. The m ...more...



Ronald Stamper

topic

Ronald K. (Ron) Stamper (born 1934) is a British computer scientist, formerly a researcher in the LSE and Emeritus Professor at the University of Twente , known for his pioneering work in Organisational semiotics , and the creation of the MEASUR methodology and the SEDITA framework. Biography Born in West Bridgford , United Kingdom , Stamper obtained his MA in Mathematics and Statistics at Oxford University in 1959. Stamper started his career in industry, first in hospital administration and later in the steel industry . He starting applying operational research methods with the use of computers, and evolved into the management of information systems development. In need of more experts, he developed one of the first courses in systems analysis in the UK. In 1969 he moved into the academic world, starting at the London School of Economics as Senior Lecturer and Principal Investigator. From 1988 to 1999 he was Professor of Information Management at the University of Twente at its Faculty of Technology and Mana ...more...



List of academic fields

topic

Mind map of top level disciplines and professions The following outline is provided as an overview of and topical guide to academic disciplines: An academic discipline or field of study is a branch of knowledge that is taught and researched as part of higher education . A scholar's discipline is commonly defined and recognized by the university faculties and learned societies to which he or she belongs and the academic journals in which he or she publishes research . However, no formal criteria exist for defining an academic discipline. Disciplines vary between well-established ones that exist in almost all universities and have well-defined rosters of journals and conferences and nascent ones supported by only a few universities and publications. A discipline may have branches, and these are often called sub-disciplines. There is no consensus on how some academic disciplines should be classified (e.g., whether anthropology and linguistics are disciplines of social sciences or fields within the humanities ). ...more...



Outline of databases

topic

The following outline is provided as an overview of and topical guide to databases : Database – organized collection of data , today typically in digital form. The data are typically organized to model relevant aspects of reality (for example, the availability of rooms in hotels), in a way that supports processes requiring this information (for example, finding a hotel with vacancies). What type of things are databases? Databases can be described as all of the following: Information – sequence of symbols that can be interpreted as a message. Information can be recorded as signs, or transmitted as signals. Data – values of qualitative or quantitative variables, belonging to a set of items. Data in computing (or data processing) are often represented by a combination of items organized in rows and multiple variables organized in columns. Data are typically the results of measurements and can be visualised using graphs or images. Computer data – information in a form suitable for use with a computer. Data is oft ...more...



MarkLogic

topic

MarkLogic Corporation is an American software business that develops and provides an enterprise NoSQL database, also named MarkLogic. The company was founded in 2001 and is based in San Carlos , California . MarkLogic is a privately held company with over 500 employees and has offices throughout the United States , Europe , Asia , and Australia . MarkLogic has over 550 customers, including Comcast , Deutsche Bank , Erie Insurance Group , Johnson & Johnson , and the US Army . Also, six of the top ten global banks are MarkLogic customers. According to Forrester Research , MarkLogic is among the NoSQL databases vendors with the strongest offerings in the market and regularly appears in Gartner Leaders Quadrant in the Magic Quadrant for Operational Database Management Systems. History MarkLogic was first named Cerisent and was founded in 2001 by Christopher Lindblad, who was the Chief Architect of the Ultraseek search engine at Infoseek , and Paul Pedersen, a professor of computer science at Cornell Univ ...more...



RSI

topic

RSI may refer to: Business RADARSAT International , a provider of data and information derived from a Canadian remote-sensing Earth observation satellite program overseen by the Canadian Space Agency Relative strength index , a technical indicator used in the analysis of financial markets RSI Corporation , RadioFrequency Safety International, a safety firm specializing in OSHA/FCC radio frequency (RF) compliance Relational Semantics, Inc , a U.S. software company specializing in case management systems for state judiciaries Science and technology RSI register , a 64-bit processor register of x86 CPUs Records Series Identifiers, a method used in records management for applying retention and follow-up information for electronic documents Review of Scientific Instruments , a scientific journal Repetitive strain injury , a disorder affecting muscles, tendons and nerves from repetitive movements, forceful exertions, vibrations, mechanical compression, or sustained/awkward positions. Relative strength index Researc ...more...



WS-Policy

topic

WS-Policy is a specification that allows web services to use XML to advertise their policies (on security , quality of service , etc.) and for web service consumers to specify their policy requirements. WS-Policy is a W3C recommendation as of September 2007. WS-Policy represents a set of specifications that describe the capabilities and constraints of the security (and other business) policies on intermediaries and end points (for example, required security tokens, supported encryption algorithms, and privacy rules) and how to associate policies with services and end points. Policy Assertion Assertions can either be requirements put upon a web service or an advertisement of the policies of a web service. Operator tags Two "operators" (XML tags) are used to make statements about policy combinations: wsp:ExactlyOne - asserts that only one child node must be satisfied. wsp:All - asserts that all child nodes must be satisfied. Logically, an empty wsp:All tag makes no assertions. Policy Intersection If both provid ...more...



Oracle Spatial and Graph

topic

Oracle Spatial and Graph , formerly Oracle Spatial, forms a separately-licensed option component of the Oracle Database . The spatial features in Oracle Spatial and Graph aid users in managing geographic and location-data in a native type within an Oracle database, potentially supporting a wide range of applications — from automated mapping , facilities management , and geographic information systems ( AM/FM/GIS ), to wireless location services and location-enabled e-business . The graph features in Oracle Spatial and Graph include Oracle Network Data Model (NDM) graphs used in traditional network applications in major transportation , telcos, utilities and energy organizations and RDF semantic graphs used in social networks and social interactions and in linking disparate data sets to address requirements from the research, health sciences, finance, media and intelligence communities. Components The geospatial feature of Oracle Spatial and Graph provides a SQL schema and functions that facilitate the storage ...more...



Software development

topic

Software development is the process of computer programming , documenting , testing , and bug fixing involved in creating and maintaining applications and frameworks resulting in a software product . Software development is a process of writing and maintaining the source code , but in a broader sense, it includes all that is involved between the conception of the desired software through to the final manifestation of the software, sometimes in a planned and structured process. Therefore, software development may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities that result in software products. Software can be developed for a variety of purposes, the three most common being to meet specific needs of a specific client/business (the case with custom software ), to meet a perceived need of some set of potential users (the case with commercial and open source software ), or for personal use (e.g. a scientist may write software to automate a ...more...



Peyman Faratin

topic

Peyman Faratin (born September 16, 1965) is an Iranian/American computer scientist, and the founder of Robust Links, an Internet company building algorithms for creating and processing a knowledge graph. Background Peyman completed his PhD in computer science under the supervision of Prof. Nicholas R. Jennings and Prof. Carles Sierra. He made significant contributions in the area of artificial intelligence, particularly to automated negotiation in multi-agent systems . He was then a research scientist at MIT 's Computer Science and Artificial Intelligence (CSAIL) laboratory, working with David D. Clark in the Advanced Network Architecture group. Peyman has over eighteen years of experience in design and implementation of online marketplaces. He graduated from University of London (EECS department) in 2000 completing his doctoral thesis on algorithms for online bargaining and auction mechanisms, with application to business process management and supply chain management in telecommunication domains. Between 20 ...more...



Ontology engineering

topic

Ontology engineering in computer science and information science is a field which studies the methods and methodologies for building ontologies : formal representations of a set of concepts within a domain and the relationships between those concepts. A large-scale representation of abstract concepts such as actions, time, physical objects and beliefs would be an example of ontological engineering. Ontology engineering is one of the areas of applied ontology , and can be seen as an application of philosophical ontology . Core ideas and objectives of ontology engineering are also central in conceptual modeling . Overview “ Ontology engineering aims at making explicit the knowledge contained within software applications, and within enterprises and business procedures for a particular domain. Ontology engineering offers a direction towards solving the inter-operability problems brought about by semantic obstacles, i.e. the obstacles related to the definitions of business terms and software classes. Ontology eng ...more...



Machine-Readable Documents

topic

Machine-readable documents are documents whose content can be readily processed by computers . Such documents are distinguished from machine-readable data by virtue of having sufficient structure to provide the necessary context to support the business processes for which they are created. Definition Data without context (language use) is meaningless and lacks the four essential characteristics of trustworthy business records specified in ISO 15489 Information and documentation -- Records management : Reliability Authenticity Integrity Usability The vast bulk of information is unstructured data and, from a business perspective, that means it is "immature", i.e., Level 1 (chaotic) of the Capability Maturity Model . Such immaturity fosters inefficiency, diminishes quality, and limits effectiveness. Unstructured information is also ill-suited for records management functions, provides inadequate evidence for legal purposes, drives up the cost of discovery in litigation , and makes access and usage needlessly cum ...more...



Middleware (distributed applications)

topic

Middleware in the context of distributed applications is software that provides services beyond those provided by the operating system to enable the various components of a distributed system to communicate and manage data. Middleware supports and simplifies complex distributed applications . It includes web servers , application servers , messaging and similar tools that support application development and delivery. Middleware is especially integral to modern information technology based on XML , SOAP , Web services , and service-oriented architecture . Middleware often enables interoperability between applications that run on different operating systems, by supplying services so the application can exchange data in a standards-based way. Middleware sits "in the middle" between application software that may be working on different operating systems . It is similar to the middle layer of a three-tier single system architecture, except that it is stretched across multiple systems or applications. Examples incl ...more...



Cloud Infrastructure Management Interface

topic

Cloud Infrastructure Management Interface ( CIMI ) is an open standard API specification for managing cloud infrastructure. CIMI's goal is to enable users to manage cloud infrastructure in a simple way by standardizing interactions between cloud environments to achieve interoperable cloud infrastructure management between service providers and their consumers and developers. CIMI 1.1 was registered as an International Standard in August 2014 by the Joint Technical Committee 1 (JTC 1) of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). Overview The CIMI standard is defined and published by the Distributed Management Task Force (DMTF). It includes the Cloud Infrastructure Management Interface (CIMI) Model and RESTful HTTP-based Protocol specification, the CIMI XML Schema , the CIMI Primer and the CIMI Uses Cases whitepaper: Cloud Infrastructure Management Interface (CIMI) Model and RESTful HTTP-based Protocol CIMI XML Schema CIMI Primer CIMI Use ...more...



Business Process Execution Language

topic

The Web Services Business Process Execution Language ( WS-BPEL ), commonly known as BPEL ( Business Process Execution Language ), is an OASIS standard executable language for specifying actions within business processes with web services . Processes in BPEL export and import information by using web service interfaces exclusively. Overview One can describe Web-service interactions in two ways: as executable business processes and as abstract business processes. An executable business process: models an actual behavior of a participant in a business interaction. An abstract business process: is a partially specified process that is not intended to be executed. Contrary to Executable Processes, an Abstract Process may hide some of the required concrete operational details. Abstract Processes serve a descriptive role, with more than one possible use case , including observable behavior and/or process template . WS-BPEL aims to model the behavior of processes, via a language for the specification of both Executa ...more...



SQL

topic

SQL (  (   listen ) S-Q-L, "sequel"; Structured Query Language ) is a domain-specific language used in programming and designed for managing data held in a relational database management system (RDBMS), or for stream processing in a relational data stream management system (RDSMS). In comparison to older read/write APIs like ISAM or VSAM , SQL offers two main advantages: first, it introduced the concept of accessing many records with one single command; and second, it eliminates the need to specify how to reach a record, e.g. with or without an index. Originally based upon relational algebra and tuple relational calculus , SQL consists of many types of statements, which may be informally classed as sublanguages , commonly: a data query language (DQL), a data definition language (DDL), a data control language (DCL), and a data manipulation language (DML) . The scope of SQL includes data query, data manipulation (insert, update and delete), data definition ( schema creation and modification), and data a ...more...



Data lake

topic

A data lake is a method of storing data within a system or repository, in its natural format, that facilitates the collocation of data in various schemata and structural forms, usually object blobs or files. The idea of data lake is to have a single store of all data in the enterprise ranging from raw data (which implies exact copy of source system data) to transformed data which is used for various tasks including reporting , visualization , analytics and machine learning . The data lake includes structured data from relational databases (rows and columns), semi-structured data (CSV, logs, XML, JSON), unstructured data (emails, documents, PDFs) and even binary data (images, audio, video) thus creating a centralized data store accommodating all forms of data. A data swamp is a deteriorated data lake, that is inaccessible to its intended users and provides little value. Background James Dixon, then chief technology officer at Pentaho allegedly coined the term to contrast it with data mart , which is a smalle ...more...



Bernhard Thalheim

topic

Bernhard Karl Thalheim (born 1952) is a German computer scientist and Professor of Information Systems Engineering at the University of Kiel who is known for conceptual modeling and its theoretical foundational contributions. Biography Born in Radebeul near Dresden , Germany, Thalheim received his M.Sc. in Mathematics and Computer science in 1975 at the Dresden University of Technology , his PhD in Discrete mathematics in 1979 at the Lomonosov Moscow State University , and his Habilitation in Theoretical computer science in 1985 at the Dresden University of Technology . From 1986 to 1989 Thalheim was an Associate Professor at the Dresden University of Technology. In 1989 he moved to the University of Rostock , where he was Professor until 1993. From 1993 to 2003 he was Dean and Full Professor at the Brandenburg Technical University , and since 2003 he is Professor at the Christian-Albrechts-Universität of Kiel. Thalheim has been Visiting Professor at the Kuwait University ; at the University of Klagenfurt , A ...more...


