Logical database design can be carried out in a variety of ways: top-down, bottom-up, or a combination of the two. The traditional approach, particularly for relational databases, has been a low-level, bottom-up activity, synthesizing individual data elements into normalized tables after a careful analysis of the data element interdependencies defined during requirements analysis. Although the traditional technique has had some success with small to medium-sized databases, its complexity can become overwhelming when applied to large databases, to the point that designers rarely use it with any regularity. In practice, a combination of the top-down and bottom-up approaches is used; in most cases, tables can be defined directly from the requirements analysis.
The conceptual data model has been the most successful tool for communication between designers and end users during requirements analysis and logical design. Its success is due to the fact that the model, expressed using either ER diagrams or UML, is easy to understand and easy to present. Its effectiveness comes from its being a top-down approach that uses the abstraction of an entity. The number of entities in a database is far smaller than the number of individual data elements, because data elements usually represent attributes. Using entities as an abstraction over data elements, and focusing on the relationships between entities, greatly reduces the number of objects to be considered and simplifies the analysis. Although it is still necessary to represent data elements as attributes of entities at the conceptual level, their dependencies are normally confined to other attributes within the same entity or, in some special cases, to attributes of other entities directly related to their home entity. The major inter-attribute dependencies appear in the data model as dependencies between entities, and these are captured through the unique identifiers of the different entities during the conceptual data modeling process. Special cases, such as dependencies among data elements of unrelated entities, can be handled when they are discovered in subsequent data analysis.

The logical database design approach defined here uses the conceptual data model and the relational model in successive stages. It benefits from the simplicity and ease of use of the conceptual data model and from the structure and associated formalism of the relational model. To facilitate this approach, it is necessary to build a framework for transforming the various conceptual data model constructs into tables that are already normalized, or that can be normalized with a minimum of change. The beauty of this approach is that the transformation yields normalized SQL tables almost from the start; frequently, further normalization is not necessary.
Before doing so, however, we first need to define the major steps of logical design in the context of the database life cycle.
Requirements analysis is an extremely important step in the database life cycle and is typically the most labor-intensive. The database designer must interview the end-user community to determine what the database is to be used for and what it must contain. The basic objectives of requirements analysis are:
To delineate the data requirements of the enterprise in terms of basic data elements.
To describe the data elements, and the relationships among them, needed to model these data requirements.
To determine the types of transactions to be executed on the database and the interactions between the transactions and the data elements.
To define any performance, integrity, security, or administrative constraints that must be imposed on the resulting database.
To specify any design and implementation constraints, such as specific technologies, hardware and software, programming languages, policies, standards, or external interfaces.
To document all of the preceding in a detailed requirements specification. The data elements can also be defined in a data dictionary system, often provided as an integral part of the database management system.
Conceptual data modeling helps the designer capture real-world data requirements accurately, because it requires a focus on semantic detail in the data relationships that is greater than the detail provided by functional dependencies (FDs) alone. The semantics of the ER model, for instance, allow the direct transformation of entities and relationships into at least first-normal-form (1NF) tables. They also provide clear guidelines for defining integrity constraints. In addition, abstraction techniques such as generalization provide useful tools for integrating end-user views into a unified global conceptual model.
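As a minimal sketch of the transformation just described, the following uses Python's sqlite3 module with invented entity names (Department and Employee are assumptions, not taken from the text): each entity becomes a table whose primary key is the entity's unique identifier, and a one-to-many relationship becomes a foreign key on the "many" side.

```python
import sqlite3

# Hypothetical ER fragment: Department --1:N--> Employee.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

conn.execute("""
    CREATE TABLE department (
        dept_no   INTEGER PRIMARY KEY,   -- unique identifier from the ER model
        dept_name TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE employee (
        emp_no  INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        dept_no INTEGER REFERENCES department(dept_no)  -- the 1:N relationship
    )""")

# Every non-key column depends only on its table's key, so both tables are
# in (at least) first normal form straight from the transformation.
conn.execute("INSERT INTO department VALUES (10, 'Engineering')")
conn.execute("INSERT INTO employee VALUES (1, 'Ada', 10)")
row = conn.execute(
    "SELECT e.name, d.dept_name "
    "FROM employee e JOIN department d USING (dept_no)").fetchone()
```

The point of the sketch is that no post-hoc normalization step was needed; the tables are normalized by construction.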
An important part of the database design process is step II(b), the integration of different user views into a single unified, nonredundant global schema. The individual end-user views are represented by conceptual data models, and the integrated conceptual model results from a comprehensive analysis of those views in which any differences of perspective and terminology among the end users are resolved. Experience has shown that nearly every such conflict can be resolved in a meaningful way through integration techniques. Diversity in models occurs when different users or user groups develop their own unique perspectives of the world, or at least of the portion of the enterprise to be represented in the database. For example, the marketing division tends to view the whole product as the basic unit of sales, while the engineering division may concentrate on the individual parts of the whole product. In another case, one user may view a project in terms of its goals and the progress toward meeting those goals over time, while another user may view the same project in terms of its resource needs and the personnel assigned to it. Such differences cause the conceptual models to appear to have incompatible relationships and terminology. These differences show up in conceptual data models as different levels of abstraction, different connectivities of a relationship (one-to-many, many-to-many, and so on), or the same concept being modeled as an entity, an attribute, or a relationship, depending on the user's point of view.
Assumptions of the design:
Produce the logical data model as an intermediate deliverable rather than moving directly to a physical data model:
Because it is produced by an explicit transformation from the conceptual data model, the logical data model reflects the business information requirements without being obscured by any changes made for performance; in particular, it reflects the rules about dependencies among the data (such as functional dependencies). These rules often cannot be deduced from the physical data model, which may have been denormalized or otherwise compromised.
If the database is ported to another DBMS supporting a similar structure (for example, another relational DBMS, or a new version of the same DBMS with different performance characteristics), the logical data model can serve as the basis for the new physical data model.
The task of transforming the conceptual data model into a logical model is relatively simple, certainly more so than the conceptual modeling phase, and even for a large model it is unlikely to take more than a few days. Indeed, many computer-aided software engineering (CASE) tools provide facilities for generating the logical data model automatically from the conceptual model.
A number of transformations are required. Some of them involve alternatives, and thus require decisions to be made, while others are essentially mechanical. We describe both types in detail. In general, the decisions do not require business input, which is why we have deferred them until this stage.
If you are using a DBMS that is not based on a simple relational model, you will need to adapt the principles and techniques described here to suit the specific product. Nevertheless, the basic relational model represents the closest thing we have to a universal, simply structured view of computer data, and there is a very good case for producing a relational data model as an interim deliverable even if the target is not a relational DBMS. From here on, unless otherwise qualified, the term logical model should be read as referring to a relational model. Similarly, if you are using a CASE tool that enforces particular transformation rules, or that does not even allow separate conceptual and logical models, you will need to adapt your approach. In any case, even though this is probably the most mechanical stage of the data modeling life cycle, your attitude should not be mechanical. An alert modeler will often spot problems and challenges that slipped through the earlier stages and that make it necessary to revisit or redefine the conceptual model.
At this point the process is iterative rather than linear, because we need to deal with an interdependency between the two tasks. We cannot specify a foreign key until we know the primary key of the table it points to; on the other hand, some primary keys may include foreign key columns (which may make up some or all of a table's primary key). This means that we cannot specify all of the primary keys in our model and then all of the foreign keys, or vice versa. Instead, we work back and forth.
First, we determine primary keys for the tables representing independent entity classes (entity classes that are not at the "many" end of any mandatory one-to-many relationship; loosely speaking, they are "independent" entity classes). We can then implement all of the foreign keys that point back to those tables. Doing so enables us to define primary keys for the tables representing any entity classes dependent on those independent entity classes, and then to implement the foreign keys that point back to them, and so on.
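The back-and-forth process above can be sketched as follows, using sqlite3 and invented table names (customer, purchase_order, and order_line are illustrative assumptions): pass one assigns a stand-alone primary key to the independent entity class; later passes add foreign keys, which for a dependent entity class may form part of its primary key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Pass 1: "customer" is independent, so its primary key stands alone.
conn.execute(
    "CREATE TABLE customer (customer_no INTEGER PRIMARY KEY, name TEXT)")

# Pass 2: "purchase_order" depends on customer; its foreign key can now be
# specified because customer's primary key is already known.
conn.execute("""
    CREATE TABLE purchase_order (
        order_no    INTEGER PRIMARY KEY,
        customer_no INTEGER NOT NULL REFERENCES customer(customer_no)
    )""")

# Pass 3: "order_line" depends on purchase_order; the foreign key order_no
# doubles as part of the primary key (a foreign key inside a primary key).
conn.execute("""
    CREATE TABLE order_line (
        order_no INTEGER REFERENCES purchase_order(order_no),
        line_no  INTEGER,
        qty      INTEGER,
        PRIMARY KEY (order_no, line_no)
    )""")

conn.execute("INSERT INTO customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO purchase_order VALUES (100, 1)")
conn.execute("INSERT INTO order_line VALUES (100, 1, 5)")
count = conn.execute("SELECT COUNT(*) FROM order_line").fetchone()[0]
```

Note that neither pass could have been completed in isolation: order_line's primary key could not be written until purchase_order's primary key existed.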
In some cases, an entity class may have been included in the conceptual data model to provide context, even though there is no actual requirement for the application to keep data corresponding to that entity class. It is also possible that the data is held somewhere other than in the relational database, such as in an external file, an XML stream, and so on.
We do not recommend specifying a classifying entity class purely to support category attributes during the conceptual modeling phase. However, if you are working with a conceptual model that contains such entity classes, you should not implement them as tables at this stage, but rather defer the decision until the next phase of logical design, so that all category attributes can be considered together and a consistent approach adopted.
A many-to-many relationship can be represented as an additional entity class associated with the two original entity classes by one-to-many relationships.
A DBMS that supports the set-type constructs of SQL:1999 would allow you to implement a many-to-many relationship without creating an additional table. However, we do not recommend including such a structure in your logical data model. The decision whether to use such a structure should be made during physical database design.
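A minimal sketch of resolving a many-to-many relationship into an intersection table (student, course, and enrolment are assumed names, not from the text): the intersection table carries a one-to-many relationship back to each original entity class, and its primary key is the pair of foreign keys.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

conn.execute(
    "CREATE TABLE student (student_no INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE course (course_no INTEGER PRIMARY KEY, title TEXT)")

# Intersection table: one row per (student, course) pair; each foreign key
# implements a one-to-many relationship back to an original entity class.
conn.execute("""
    CREATE TABLE enrolment (
        student_no INTEGER REFERENCES student(student_no),
        course_no  INTEGER REFERENCES course(course_no),
        PRIMARY KEY (student_no, course_no)
    )""")

conn.executemany("INSERT INTO student VALUES (?, ?)",
                 [(1, 'Ada'), (2, 'Alan')])
conn.execute("INSERT INTO course VALUES (501, 'Databases')")
conn.executemany("INSERT INTO enrolment VALUES (?, ?)",
                 [(1, 501), (2, 501)])
enrolled = conn.execute(
    "SELECT COUNT(*) FROM enrolment WHERE course_no = 501").fetchone()[0]
```

The composite primary key also prevents the same pairing being recorded twice, a rule the original many-to-many notation left implicit.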
The entity-relationship conventions used here do not support the direct representation of relationships involving three or more entity classes (n-ary relationships). If we encountered such a relationship during the conceptual modeling phase, we would have been forced to resolve it using an intersection entity class, in anticipation of implementation. There is nothing more to do at this stage, since the standard transformation of entity classes into tables covers the intersection entity class. However, you should check its normalization; this structure yields well-formed data in the most common cases, but the result may violate third, fourth, or fifth normal form. If you are using UML (or another convention that supports n-ary relationships), you will need to resolve each such relationship at this point (that is, represent each n-ary relationship as an intersection table).
The relational model and relational DBMSs do not provide direct support for subtypes and supertypes. Therefore, any subtypes included in the conceptual data model must be replaced by standard relational structures in the logical data model. Since we retain the conceptual data model, we do not lose the business rules and other requirements represented by the subtypes when we create the new model. This is important because there is more than one way to represent a supertype/subtype set in a logical data model, and our decision for a particular set may need to be revisited in the light of new information (such as changed transactions, other changes to business processes, or the provision of new facilities by the DBMS), or if the system is ported to a different DBMS. In fact, if the DBMS supports subtypes directly, supertypes and subtypes can be retained in the logical data model; the SQL:1999 standard (ANSI/ISO/IEC 9075) provides direct support for subtypes, and at least one object-relational DBMS provides this support.
Conversely, if the supertype is implemented as multiple subtype tables rather than as a table of its own, any foreign key representing a relationship to the supertype could hold a value drawn from any of the subtype tables. Referential integrity for relationships to the supertype can therefore only be managed in program logic.
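The kind of program logic this implies might look like the following sketch (Party, Person, and Organisation are invented names): because no single table holds all supertype instances, a declarative foreign key is impossible, and the existence check must search every subtype table by hand.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The hypothetical supertype Party exists only as its two subtype tables.
conn.execute(
    "CREATE TABLE person (party_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE organisation (party_id INTEGER PRIMARY KEY, name TEXT)")
# account.owner_id "points at Party", but no FOREIGN KEY can be declared.
conn.execute(
    "CREATE TABLE account (account_no INTEGER PRIMARY KEY, owner_id INTEGER)")

def owner_exists(party_id: int) -> bool:
    """Manual referential-integrity check: the owner may be a row in any
    subtype table, so every subtype table must be searched."""
    for table in ("person", "organisation"):
        if conn.execute(f"SELECT 1 FROM {table} WHERE party_id = ?",
                        (party_id,)).fetchone():
            return True
    return False

conn.execute("INSERT INTO person VALUES (1, 'Ada')")
ok = owner_exists(1)    # found in the person subtype table
bad = owner_exists(99)  # no subtype row; an insert referencing 99 should be refused
```

Every program that inserts or updates account rows must remember to call such a check, which is exactly the fragility the text is warning about.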
Another relevant factor is the facilities available for accessing the data. We do not always access the tables of a relational database directly. Often we access them through views, which merge or select data from one or more tables in a variety of ways. We can use the standard facilities for constructing views to present the data at either the subtype or the supertype level, regardless of whether we chose to implement the subtypes, the supertype, or both. However, there are limitations. Not all views allow the data they present to be updated. This is sometimes due to restrictions imposed by the particular DBMS, but there are also logical constraints on what types of views can be updated. In particular, if the data in a view is merged from more than one table, it may not be possible to interpret unambiguously which underlying tables are to be updated. Discussing view construction and its limitations in any detail is beyond the scope of this book.
Different DBMSs provide different facilities that may make one or another option more attractive. The ability to construct useful, updatable views is thus another factor in choosing the most appropriate implementation option. It is important, however, to recognize that views are not a substitute for careful modeling of subtypes and supertypes and for considering the appropriate level of implementation. Identifying useful classifications of the data is part of the data modeling process and should not be left to the later task of view definition. If subtypes and supertypes are not identified during the conceptual modeling phase, we cannot expect the process modelers to take advantage of them. There is little point in providing views unless they are used by the programs that access the database.
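To make the view limitation concrete, here is a sketch (table and view names are assumptions) of a supertype-level view built over two subtype tables. Reading through the merged view works, but updating through it does not, because a merged row cannot be mapped unambiguously back to one underlying table; SQLite, used here, rejects the attempt outright.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE current_employee (emp_no INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE former_employee (emp_no INTEGER PRIMARY KEY, name TEXT)")

# Supertype-level view: presents the union of the subtype tables.
conn.execute("""
    CREATE VIEW employee AS
        SELECT emp_no, name, 'CURRENT' AS status FROM current_employee
        UNION ALL
        SELECT emp_no, name, 'FORMER'  AS status FROM former_employee
""")
conn.execute("INSERT INTO current_employee VALUES (1, 'Ada')")
conn.execute("INSERT INTO former_employee VALUES (2, 'Alan')")

total = conn.execute("SELECT COUNT(*) FROM employee").fetchone()[0]

# An update through the merged view is refused by the DBMS.
try:
    conn.execute("UPDATE employee SET name = 'X' WHERE emp_no = 1")
    updatable = True
except sqlite3.OperationalError:
    updatable = False
```

Some DBMSs offer workarounds (for example, INSTEAD OF triggers), but those must be written per view, which reinforces the point that views complement rather than replace subtype modeling decisions.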
If a supertype is implemented as a table, and at least one of its subtypes is also implemented as a table, any process that creates an instance of that subtype (or of one of its subtypes) must create both the appropriate row in the supertype table and the appropriate row(s) in the subtype table(s). To ensure that this happens, those responsible for writing the detailed transaction specifications (which we assume are expressed in terms of tables) from the business process specifications (which we assume are expressed in terms of entity-level transactions) must be made aware of these rules.
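A minimal sketch of that rule (Party and Person are invented names): the entity-level operation "create a Person" maps to two table-level inserts, and wrapping them in one transaction ensures that neither row can exist without the other.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Hypothetical supertype and subtype, both implemented as tables.
conn.execute(
    "CREATE TABLE party (party_id INTEGER PRIMARY KEY, party_type TEXT)")
conn.execute("""
    CREATE TABLE person (
        party_id   INTEGER PRIMARY KEY REFERENCES party(party_id),
        birth_date TEXT
    )""")

def create_person(party_id: int, birth_date: str) -> None:
    """Entity-level 'create Person' mapped to two table-level inserts."""
    with conn:  # one transaction: both rows are committed, or neither is
        conn.execute("INSERT INTO party VALUES (?, 'PERSON')", (party_id,))
        conn.execute("INSERT INTO person VALUES (?, ?)",
                     (party_id, birth_date))

create_person(1, "1990-01-01")
row = conn.execute("""
    SELECT pa.party_type, pe.birth_date
    FROM party pa JOIN person pe USING (party_id)
""").fetchone()
```

The foreign key enforces that a person row cannot appear without a party row; the transaction wrapper covers the reverse direction at creation time.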
Assumptions of the implementation:
The transition from logical to physical database design marks a change in focus and in the skills required. Up to this point we have been developing a set of data structures; now we make those structures perform on the particular hardware platform and DBMS we are using. Instead of generic data structuring and business skills, we need a detailed knowledge of general performance-tuning techniques and of the facilities provided by the DBMS. This stage is therefore often handled by a different, more technically oriented database designer. In that case, the data modeler's role is primarily to review the proposed changes to the tables and columns, which may be required as a last resort to achieve performance goals.
A persistent myth about database design is that response times for data retrieval from a fully normalized set of tables and columns will be unacceptable. As with most myths, there is a grain of truth in the assertion. Certainly, if large volumes of data are retrieved, or if the database itself is very large, or if queries are unnecessarily complex, or if the data is not appropriately indexed, slow response times may result. However, there is much that can be done by tuning the database and queries before any modification of the table and column definitions in the logical data model becomes necessary. This has become ever more true as overall computer performance has improved and as DBMS designers have continued to develop their optimizers (the built-in software within a DBMS that selects the most efficient means of executing each query).
The database designer is interested not only in the tables and columns, but also in the infrastructure components, such as indexes and physical storage mechanisms, that support data management and performance requirements. Because program logic depends only on the tables and columns (and on the views based on them), that set of components is commonly referred to as the logical schema.
If you are a professional data modeler, you may be tempted to skip this material, because much of it concerns the tools and work of the physical database designer. We encourage you not to do so. One of the keys to getting good results from physical database design is communication and mutual respect between the database designer and the data modeler. That means each understanding what the other does and how it is done. Good architects maintain up-to-date knowledge of building materials.