Microsoft SQL Server Black Book
The Database Designer's and Administrator's Essential Guide to Setting Up Efficient Client-Server Tasks with SQL Server


I have fought and continue to fight the same battles that you do on a daily basis. That is why I think this book can be such a great value to you! Hopefully my experience will help you develop solid database systems in your Microsoft SQL Server environment. Each chapter is broken into two sections. The first part of each chapter presents explanatory material about the chapter topics. The first part of the chapter ends with a Summary section, which is a bulleted list of the important points of the chapter.

The second part of each chapter, the Practical Guide, supplies you with step-by-step tasks that reinforce the content of the chapter and provide hands-on practice.

Chapters 1 to 3 cover the installation and configuration of Microsoft SQL Server for both development and production environments. Chapter 4 explains the setup and terminology needed to implement replication between SQL servers. Chapters 10, 11, and 12 cover the troubleshooting and tuning skills you will need to support your system over the long haul.

One of the points I emphasize in this book is that you can solve any technical problem you are facing with the tools available to you. What are those tools? How do you research answers to your questions? How do you know if you can trust the sources you consult? How do particular features really work, and will they work for you?

I cover all these questions and more in the pages of this book. I hope that you enjoy reading it as much as I have enjoyed writing it.

Prerequisites

This book is geared toward readers with a broad range of backgrounds. Many readers may have never worked with Microsoft SQL Server before, so I have tried to write a book that can transform a beginner into a power user. At the same time, I have added plenty of advanced concepts and techniques to each chapter that experienced DBAs can use to get their servers running like a thoroughbred.

The book assumes a basic understanding of Windows NT. The exercises and examples will run on any machine that can run the client utilities for Microsoft SQL Server.

Technical Support

If you find any errors or need further help with any topic in this book, you can reach me on the Internet through my email account: pDalton@msn.com. Please do not hesitate to send me feedback, whether positive or negative, concerning this book.

Acknowledgments

I would like to thank a few people who have been key in my personal and professional life.

First, I would like to thank my wife, Diane. She has provided support day in and day out for years now. As for everyone else I should thank: never mind, they already know.

One of the hardest things to do is troubleshoot problems when you have no idea what is working and what is not. Assume nothing. I have always used the break-it-down-into-its-simplest-form approach in troubleshooting.

If you cannot get out on the network, no one will be able to connect to your server for data. Consider using a redundant power supply and surge protection for your unit. Use a UPS that is reliable and test it occasionally. An untested backup strategy is just that: untested. Finally, if you are setting up a system for a third-party application, be aware of the requirements for that system. Find out ahead of time what special configurations will need to be set to let a particular piece of software run well.

You might find that a system configured to run one application well might not allow another to run well at all. If this case arises, contact the vendor to determine how you can adjust your server configuration to best provide for both situations.

You cannot see all failures or issues ahead of time; we are all human and make mistakes. By preparing yourself for these mistakes, you will know where to look, how to look, and what action to take to solve the problems that arise.

The largest compliment I can get as an instructor is to have students call me or email me with a problem, tell me what they think they should do to fix it, and be right on target.

This shows they are thinking things through and coming to the right conclusions; they are just lacking the confidence, which will come with time, to take the actions needed to fix the problem. The fact that you are reading this book shows you are making an effort to be prepared.

Summary

The information in this chapter might seem like a lot to think about before you install the server software, but it will help you make better decisions about the type of machine you need.

Following are the major points:
- This is not always possible, but it should be the goal to work toward.
- Use at least 32MB of RAM in a production server for best results.
- Upon installation, set your master database to 30MB to save time later.

- Faster is not always better. This disk configuration will give you a very flexible and fault-tolerant system.
- Remember that the memory option is specified in 2K increments.
- Even though there are some known inconsistencies in them, they are a great resource that is underutilized by many professionals.

- How much memory you need depends on your system and the way it will be used.
- Be cautious about putting Tempdb in RAM. Test this before doing it; you might not gain as much as you think.

Practical Guide to Preinstallation

This section will walk you through gathering some preinstallation information about your system that will help you install your server right the first time.

The Preinstallation Checklist

The preinstallation checklist file, preinstall, will help you get through the installation process as painlessly as possible. For this example, I will choose an Intel-based, single-processor Pentium machine with two hard drives.

The preinstallation checklist is a memory-aid document. Each line of the document has a comment or question that should be answered in whole or in part before installing Microsoft SQL Server.

The Windows NT Section

The Windows NT section is intended to remind us of the operating system issues that might come up during the installation and configuration of our server.

If your hardware does not exist on the approved list, you should be careful; things might go wrong. However, just because a piece of hardware is not on the list does not mean it will not work. You should check whether NT will install and run properly. Keep in mind that Microsoft will not support hardware configurations that contain any component not on the Hardware Compatibility List.

This fact should be enough to make you stay away from nonsupported components when running Windows NT. Most of the issues in this section will become common sense once you have set up an NT server and configured it for running SQL Server. This is intended to help those of you who might not be as familiar with Microsoft Windows NT as you would like.

The first area of the preinstallation checklist is displayed in Figure 1. This section covers the key entries necessary for your server to exist on an IP-based network. Each machine on an IP network needs a unique address. This address is like a street address for your house. The mail carrier needs to know what address to deliver mail to; the same applies to your machine. The network must know your address to deliver packets.

Then there is the default gateway entry. This entry is the primary exit point for your network. To understand this concept, you can compare it to your local post office knowing what other post offices to send packages to.

In a lot of networks, this entry is reserved for address 1 on the subnet; putting the gateway at address 1 makes life easier when you need to remember the address of the gateway. The next entry deals with name resolution. It is not a required entry, but it is a valuable one from a network administration standpoint.

The dynamic host configuration protocol server leases addresses dynamically to machines on your network that are DHCP-aware. This eliminates the need for static IP addresses on a network and can be a great benefit in a large enterprise.

Last in the list is an entry for the DNS server, or the domain name server. This server allows you to have a central list of names and IP addresses stored in a server environment so that you can query this server for a name or IP address instead of maintaining a separate LMHOSTS file on each machine.

Choose the configuration that most closely matches your server and follow the information for that configuration. This chapter is intended to be an installation tutorial. If you feel confident installing Microsoft SQL Server, you might want to skip to the Summary section for this chapter, located just before the Practical Guide. By reviewing the Summary, you can see if you need to go back into the chapter to review any installation steps you might be unsure of.

Frequently such departments have grown into a need for client-server databases. This type of machine would also handle replication distribution tasks very well for a medium-sized enterprise. See Chapter 4 for an extensive discussion of replication.

One other point to consider for this type of server is using it in conjunction with the Distributed Transaction Coordinator (DTC). These servers can become valuable in supporting your needs. Also, by separating processes across multiple servers, you can include some legacy systems and save money.

With a little planning, even these smaller servers can be very useful in a production environment. From an audit trail or tracking standpoint to a centralized error-handling and task management scenario, these servers are becoming a part of many enterprises. Learning how to integrate them into your plans will make you look very good in the eyes of any management group trying to resolve growth issues at lower costs.

You may want to look for a network task server application to help manage your network resources and extend the life of your legacy hardware. Base-level server configurations would typically involve an entry-level Intel-based server, possibly with 48 to 96MB of RAM. Even a smaller amount of RAM should not pose a real problem for most basic environments.

Keep in mind, however, that adding RAM to one of these servers can have a great impact on performance and should be one of the first things you consider. Also included in this group are slower Pentium machines. These lower-speed Pentium servers will surely become commonplace in smaller enterprises. When specifying a server for your needs, always consider the environment and the clients your machine will be servicing.

When choosing a smaller server, you should keep the amount of available memory in mind. Keep the additional tasks of file, print, and gateway services off these types of servers. You may notice a slower connection time with these lighter-weight servers. However, once you have connected to one of these servers, you should get acceptable query response times under a light load. This is usually exactly the machine a small office or satellite location needs from a data server.

Since this is an entry-level server, we will use a single hard drive with the free space left after installing Windows NT. This server will run Windows NT Server 4.0. Take special care during installation of the operating system to make the correct installation choices for the network. Accidentally installing NT servers as domain controllers is easy to do and should be avoided. Avoid putting any additional overhead on this type of server. Starting an application or service can place a burden on the functionality of the operating system or Microsoft SQL Server and bring your data services to a crawl.

This will negate the need for installing any type of tape backup system. Since the system will back up data while users are in the database, you should be aware of when these backups will occur. Scheduling your backups around peak activity is important with a server of this size. Take the time to ensure you can expand your server down the road. By definition, databases will always grow, and there are few things that you can do to prevent a good system from slowing down over time with increased use.

Making good purchasing decisions is as important as hiring good people to fill critical positions in your company. In addition, spending money on as current a processor architecture as possible at this level is a good idea.

Take the time to look at what machine will supply the most processing bang for your buck. Check the results of independent tests for different machines. I have had very good luck going with the established name brands.

I also insist on good technical support. Regardless of the make of your server, these white papers do a very thorough job of helping you configure it to run well in a production environment. There is a wealth of information out there for you to use—some of it is good and some is junk.

Configuring data servers can be a difficult task. Adding one nonconforming task or application can throw your performance into a downward spiral. Always begin by breaking down the problem into its simplest possible form. Then look for the obvious problem. Never assume that something is working fine. When I am asked to recommend a server, I try to get a good feel for how the server is going to be used in the future. With this middle-of-the-road class of server, you must start to look at fine-tuning your hardware choices for maximum performance.

If you cannot afford to purchase all of the components for a fault-tolerant server at one time, you should plan their addition as budgets permit. In many scenarios in this range, it is a good practice to propose during the bid process some sort of plan to upgrade or migrate the server hardware as the use and load of the server matures. Management does not view this approach as overspending but as good business planning. These estimates add value to your recommendations, both short- and long-term, and they give decision makers the realistic information they need to plan and budget IT resources.

Although this is better than not having a fault-tolerant disk system at all, a hardware-based solution is preferable.


Some of the better RAID systems have some very impressive throughput with high capacity and built-in caching. Some even support hot swap disks that, in the event of a failure, allow you to replace the bad drive in your system with an off-the-shelf disk without ever having to power down your server. Not all servers justify this kind of expense, however. Use your best judgment and choose a disk subsystem with great care. If you must use the software-level implementation of RAID, I suggest starting with a mirrored disk configuration.

By choosing the mirror configuration, you will keep performance up as much as possible while maintaining some kind of tolerance. Example Configuration For Server B http: We will again be using a good PCI network card for maximum throughput and two disk drives. The first drive will be for installing the operating system and program files; the second will be for data. Due to costs, we are not going to go with a RAID system. Instead, I will use a 1.

I will use two separate controllers for the hard drives for maximum throughput of data. This system will be a great candidate for adding a mirrored disk down the road as funding becomes available for adding fault tolerance. Choose a good disk controller that supports multiple disks or a system that will support the addition of a disk and controller for establishing a mirror—or, in the case of adding the extra controller, for duplexing the data drive.

These machines, of course, do provide some incredible numbers when it comes to throughput and horsepower. Despite their high cost, multiple-processor machines are of great interest to the majority of students in my classes.

Many companies have purchased expandable machines and are looking for answers on how to take advantage of this architecture. As we explore topics later in this book, I will describe how efficiently these monster machines can run your queries. I am focusing on Intel platforms because of their cost-to-performance advantages over other hardware platforms. In addition, RAID level 5 subsystems are usually the order of the day for these machines.

These servers usually have between 8 and 32GB of disk space. Redundant power supplies and a replacement drive on the shelf are musts when these systems go online.

These high-end servers are not something that the average programmer should jump into. Great care should be taken to configure these machines to suit their final environment. I recommend having a consultant handle the initial hardware configuration for you if you have even the smallest doubt as to whether you can do it on your own. Also, I have seen many servers that run at a much lower speed than they should because of one incorrect setting or configuration option.

The most common questions I get concern installation. The topics covered in this section can help you solve many of the problems associated with installing Microsoft SQL Server without any training or guidance.

I have seen many newsgroup messages that could have been avoided had the reader followed these suggestions. The first item to get right is the service account, which is created with the User Manager for Domains. The name you choose is not as important as the permissions you give. I usually choose a name that makes sense to me.

This domain account should be granted the Log On As A Service right so that it can get to the server when it needs to. Do not place any account restrictions on this login. Select a password in accordance with good security practices and make sure that the checkboxes for User Must Change Password and Account Never Expires are set properly.

See Figure 2. Having the server run under the permissions of a dedicated account gives you an extra layer of security. In the event that some unauthorized user gets access to your server, this extra security precaution can save you some headaches. When you create an account for the server, make sure you grant the same permissions as you did for the Executive service account.

This account can also be used for a mail client application. However, you should create the account before attempting to configure the SQL Mail Client or your server. In practically all mail systems, SQL Server must run under the same account as the mail client you install and configure. In the Services dialog box, click the Startup button. In the bottom of the User Properties dialog box, fill in the domain name, the account name, and the password, then select OK. Using email in your applications can provide you with a proactive management tool that many systems lack.

Be aware that each mail system is configured slightly differently; you should consult the section on installing the specific mail client in SQL Books Online for any issues associated with your mail system. Some additional overhead is involved with sending mail from inside your SQL code, so expect a slight delay in the execution of scripts or triggers that send mail.
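As a rough sketch of what sending mail from SQL code looks like once SQL Mail is running (the recipient address and message text are placeholders, not values from this book), a call to the xp_sendmail extended procedure might be:

-- Minimal sketch: notify an operator from a script, stored procedure, or trigger.
-- Assumes SQL Mail has already been started under the proper mail account;
-- the recipient address below is a placeholder.
EXEC master..xp_sendmail
    @recipients = 'dba@yourcompany.com',
    @subject    = 'SQL Server alert',
    @message    = 'The nightly load reported an error. Check the server error log.'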

Since the use of email in your code has little to do with installation, I address this topic later in the book. Mail does, however, play a huge part in setting up a proactive server that will alert you to potential problems before they get out of hand. Make sure that you create the accounts that mail must run under, and be sure to log in as those accounts when installing the mail client software. With Microsoft Exchange, for example, you need to set up a profile that matches the account that Microsoft SQL Server runs under as a service.

Failing to do this will cause configuration problems when attempting to get mail features to work properly. I will walk through setting up a Microsoft Exchange client later in this chapter.

Installing Microsoft SQL Server

One Last Time

I know that budgets and office politics come into play when deciding what kind of server to purchase or what upgrades are required to meet your needs.

Take what I have said here with a grain of salt. Few scenarios allow the best-possible design to be implemented. Fight only the battles you feel need to be fought. You can put an awful lot on a SQL server and make it do just about anything you can imagine. I have not found a bad system yet—only ones that are improperly configured. Regardless of the server you choose to install Microsoft SQL Server on, you will be tempted to add a service or two to this server.

Placing additional services on any of these servers will cause changes in the way you should configure Microsoft SQL Server. Most administrators will be tempted to place some extra files or network tasks on the higher-end machines I have talked about here.

Fight that urge! Do not forget to consider distributed processing in your plans; it can be a valuable tool on your network. As for memory, the more the better, within the range your server can actually use.

Insufficient memory limits the number of users and lengthens query response times. Creating these accounts will allow you to implement better security on your servers and to easily configure email services.

These accounts should be granted the Log On As A Service right with a good password assigned for security purposes. It is very important to ensure that mail can be sent through your client software before assuming that the SQL mail client can send mail. Practical Guide to Installation This section presents step-by-step instructions for the installation http: To install the product, it is not required that this be done first, but I have found that most easy-to-miss service configuration problems can be avoided by creating the correct user accounts before installing any software.

All three of our example servers would use the same account information and would require the same information regardless of hardware.

Log on to Windows NT as Administrator. Select New User from the User menu. In the User Properties dialog box, input the information in the following steps (see Figure 2).


1. Make sure that you provide a secure password for this account.
2. Select the User Cannot Change Password checkbox.
3. Select the Password Never Expires checkbox.
4. Click the Groups button. In the Group Membership dialog box, add this account to the Administrators group. Click OK to close this window.
5. Click Add, then Close to complete the user account.

Among the utilities installed with the server is a query tool similar to many popular manual code entry query tools.

I highly recommend this tool to anyone developing ODBC clients. The next icon displayed is only available with version 6.5. This is an RQBE tool that many of your users might already be familiar with. Microsoft Query uses ODBC to connect to the server and allows you to build queries graphically by clicking on the fields you want and selecting the order through its interface.

This is a good tool for testing any ODBC connection since the server is accessed in this manner. The readme file contains release notes for some SQL files installed on the server. This file also contains some Distributed Transaction Coordinator notes that you should read before configuring DTC services on your server.

This utility should be run any time you have connection problems with the server. Remember that the Enterprise Manager is also a client program, even though it runs on the same machine as the server itself. This is the first place I send people who call me and tell me they cannot connect to the server with a client or the Enterprise Manager.

The next yellow question mark is a Help file that explains the distributed management objects (DMO) at the root of this system. These objects can be used by client applications to perform many tasks without having to log on to the server as the administrator. I strongly recommend that client developers become familiar with the DMO structure outlined in this file.

Next is the Enterprise Manager. This is a client utility that can be run on the server or remotely to administer your entire enterprise. This tool does an excellent job of showing you the remote management abilities of your servers using the DMO. I will go into more detail on the Enterprise Manager as we progress through the book; for now, just note that this tool performs many tasks that you can add to your client applications for increased management of your server.

The SQL Performance Monitor icon represents a standard set of Windows NT counters saved in a file that you can open to test the performance of not only your SQL server, but the operating system and network as well.

This tool warrants a book unto itself; I will cover some of its uses in a later chapter. The next tool is a huge timesaver and should not be overlooked when configuring users and their ability to connect to the server.

This resource will help you troubleshoot 85 percent of your problems without having to open a book (well, except for maybe this one). I have found the tool easy to use and fairly powerful when coupled with stored procedures. If you need dynamic data access to the Web, you will still need to use the IDC or ADC features of your Web server for interactive database query capabilities.

This Web utility gives you good, basic, reproducible results with little effort. The next icon provides an easy way to stop and start SQL Server-related services. The setup icon allows you to change configuration options or network support through the setup program. Keep in mind that some changes you make in this program require you to restart the MSSQLServer service before they take effect.

Last is the SQL Trace utility. This tool is crucial to monitoring the use of your server and determining what raw SQL is being passed to your server by its clients. I will walk you through this utility in a later chapter. If you have other SQL Server-based applications to install on this server, you should check these items first before moving on to any other software.

This change will not take effect until you restart the service. This particular setting is considered to be static during the current session that Microsoft SQL Server runs under.

Development Versus Production

When management is faced with configuring a development environment, its latitude often depends on the size of the operation.

Larger MIS departments usually have the resources and backing to provide a development server that mirrors the production machine fairly closely.

Small- to mid-sized shops tend to have tighter budgets and fewer resources with which to accomplish this task. Microsoft SQL Server gives you the freedom to be creative when approaching the development process. In planning and budgeting the development environment for Microsoft SQL Server, you have many freedoms that some other systems do not allow.

You can place a test server into your environment at a very reasonable cost. By scaling your SQL server, you can simulate many production environments.

Microsoft SQL Server lends itself to scaling very well. I have found that even on inexpensive machines, Microsoft SQL Server can play a very productive role in the development process. For some environments, I have recommended having multiple SQL servers in the development environment to lower the impact of the development process.

Setting Up The Development Environment

For many projects, having many programmers simultaneously developing client applications to run against a server is common. In most cases, one or two database programmers can support the needs of even a mid-sized MIS department.

I have found that using a multiple-server development environment provides the greatest flexibility when creating stored procedures, triggers, and data modeling. One benefit of using multiple servers is the isolation your SQL code development can enjoy. Your database programmers can write and test code against their development server, then copy tested and debugged code out to the server being used by the client development team. I have used a lightweight machine similar to the example Server A in the previous chapter to facilitate just such an environment.

In the event you cannot use this concept, you might want to install Microsoft SQL Server on your own workstation. This type of development is a sharp contrast to the databases running in the UNIX environment.

You are no longer forced to develop on the same machine as everyone else. One important note: take additional care to ensure that the correct versions of the objects you develop are the ones you deploy.

With the Enterprise Manager client provided with Microsoft SQL Server, you can effectively migrate or transfer your fully debugged objects to other servers rather painlessly. In a multiple-vendor product environment, having a single product to interface with your data servers can be a great benefit.

You can also reach Embarcadero on the Internet at www. The amount of thought and planning that has gone into these products is impressive. As with other tools of this type on the market, you will experience a short learning curve. Throughout the book I will mention where the Embarcadero products have proven useful. The use and methods of third-party applications are not the focus of this book; I will use them in examples where possible, however, because tools like these do provide significant benefits.

Most of these third-party modeling and development tools do a very good job at what they were designed to do. Take some time to learn what each target design is before committing to a download. Previous Table of Contents Next Data Modeling When you begin the design process, good planning is essential. I have been contracted to fix many systems that were written by and for programmers and the user was the first on the list of considerations.

When faced with the task of actually implementing the methods, programmers myself included will always take the path of least resistance. This path will cause you to stray from your focus of providing the user with a working product.

The user is the focus of any project a developer undertakes. Users pay for our services with patience and feedback, as well as with money. The projects we develop are evaluated as to how well the user can perform the tasks we seek to automate with our creativity.

To keep myself on the right path when developing systems, I try to view the database objects as entities for as long as possible. I like to model either at the physical or conceptual layer a bit longer than most developers. I have found that by spending more time at these levels, less revision time is required down the road.

Use the data-modeling technique as a road map. Follow the design as closely as you can, but keep in mind it is only a guide.

When making changes to your data structures, update your data model to reflect your modifications. Plan a few hours a month for keeping your data model as current as possible, whether it needs it or not. Many of the third-party modeling tools available support designing and modeling of your data.

Most have a base set of features that can aid you in staying current as long as possible. There will come a time when you lose the connection between your model and your data structure.

Many things can cause this separation. Typically, problems arise when you use a tool for data modeling that does not allow you to create all the objects that your design requires. I have used most of the major modeling applications on the market today and have found limitations to each one. Be creative; use scripts to create or modify the objects that you cannot create with the modeling tool. Spending the time keeping your model current is time well spent.

When choosing a data-modeling tool, research each tool thoroughly. Keep in mind that these tools are not cheap. Look for a tool that allows you to model at both the conceptual and physical layers, and that allows you to create views and triggers. Whether development or production, all systems have two levels of configuration that need attention: server-level parameters and application-specific parameters.

Server-Level Parameters

Server-level parameters should be checked and verified when you install the server.

When creating a mirrored server to your production environment, you can use these server-level parameters to help tune or scale your server to react to queries in the same way. One important point to mention here is that some server-level parameters are dynamic and some are static. Dynamic parameters can be changed on the fly programmatically or through the Enterprise Manager.
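As a hedged illustration (the option name and value are examples only, not recommendations from this chapter), most of these parameters are viewed and changed with sp_configure followed by RECONFIGURE:

EXEC sp_configure                          -- list every option with its configured and running values
EXEC sp_configure 'user connections', 300  -- change a single option
RECONFIGURE                                -- dynamic options take effect immediately;
GO                                         -- static ones wait for the MSSQLServer service to restart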

Other parameters are static, which is to say that you must stop and start the MSSQLServer service in order for the changes to take effect.

User Connections

The User Connections parameter should be configured after installation to allow for the number of users on your system. You should set this value equal to the number of expected connections to your server. Keep in mind that one client may have more than one connection to the server at any given time.

When a multithreaded application connects to Microsoft SQL Server, it may open more than one connection at a time. In addition, the SQL Executive and the other management-type accounts on your server will use between five and seven connections. The maximum value you can set for this option is a theoretical limit of 32,767. This limit assumes you have the hardware and memory to support that kind of burden.

Each connection will take a small amount of memory away from the amount of memory that Microsoft SQL Server will use. In some of its documentation, Microsoft recommends approximately 37K per user for each user connection.

In other Microsoft documentation, however, I have seen 40K and 42K used as the magic numbers for calculating the amount of memory user connections will take. To be safe, I assume 42K per user connection.

By choosing the 42K amount, I am allowing a small bit of memory to spill over into the caches for each user. That way, the most you can be off is a net of 5K. Although that may not seem like much, using the 42K value does give you a cushion. There are some good reasons to place Tempdb in RAM, but in all cases, you should test the move to RAM thoroughly and make sure it really is providing an improvement.
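To put the 42K figure to work, here is a small arithmetic sketch; the connection count is hypothetical, and because the memory option itself is specified in 2K pages, dividing the kilobyte total by two expresses the result in pages:

DECLARE @connections int, @kb_per_connection int
SELECT  @connections = 300, @kb_per_connection = 42   -- conservative estimate per connection
SELECT  @connections * @kb_per_connection             AS approx_kb_reserved,
        (@connections * @kb_per_connection) / 1024.0  AS approx_mb_reserved,
        (@connections * @kb_per_connection) / 2       AS approx_2k_pages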

Sort Pages

Next is Sort Pages, which specifies the maximum number of pages that will be allocated to sorting query output on a user-by-user basis. On machines that execute large sorts, increasing this number can improve performance.

Resource Timeout

Resource Timeout is used by Microsoft SQL Server to determine the number of seconds to wait for a resource to be released by another process.

The default setting for this parameter is adequate for most servers. Increase this value only if the SQL Server error log has a lot of logwrite or bufwait timeout warnings in it. Checking these logs each day can help you proactively troubleshoot events that occur on your server.

I have not found a system yet that you can set up and forget.

Read-Ahead Optimization

This subject will be covered in more detail later in the book. Most of the parameters in the RA section should be changed only if you are instructed to do so by a qualified support technician. Only two of the parameters are used for tuning the RA management for your server with any regularity: RA worker threads and RA slots per thread.

RA Worker Threads

The number of read-ahead worker threads you specify can impact the performance of your server. Microsoft recommends that this option be set to the maximum number of concurrent users on the server.

The slots-per-thread option controls the number of simultaneous requests each read-ahead service thread will manage. The total number of worker threads multiplied by the number of slots equals the number of concurrent read-ahead scans that Microsoft SQL Server will support.

The default value of 5 should be fine. However, if your server has a very good disk subsystem, you might be able to increase the number of scans that a single thread can handle by adding to the default in small increments.
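A cautious way to apply that advice is sketched below; the option name should be verified with sp_configure on your build (it may only appear when advanced options are displayed), and the new value is just an example of a small increment:

EXEC sp_configure 'RA slots per thread'      -- note the current run_value before changing anything
EXEC sp_configure 'RA slots per thread', 6   -- raise the default of 5 by one small increment
RECONFIGURE
GO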

As with any change you make to your server, you should change options slowly and carefully. Write down the old value, then change one value at a time and test the impact on performance. Test and benchmark your changes and verify that they, in fact, did what you expected. Only change this option on a machine dedicated to SQL server, or you might find yourself being unable to launch other applications or tasks on your server.

Be leery of this setting when you have a dual-processor machine as well. Instead of the database having to create and manage threads internally, the operating system shares threads with SQL Server.

Other systems that are not tied to the operating system in this way must maintain their threads at the application level, thus slowing the application down even a small amount by adding additional overhead. By changing the Max Worker Threads option, you control the number of threads allocated to the user pool.

When the number of user connections is less than the worker threads setting, one thread handles each connection. However, if the number of connections surpasses the Max Worker Threads value, thread pooling occurs.

You might think that pooling threads would slow down the operation of your server and that this value should be set high. However, the default of 255 is too high for most systems. Independent third-party studies have discovered that a considerably lower setting actually allows your server to operate much more efficiently.

Lock Escalation Parameters

Lock escalation is one of the hottest topics being discussed in newsgroups on the Internet today, and it probably will be for a long time to come. This set of options can lead you down the path of eternal server adjustments if you do not look at all the reasons you are having locking issues on your server in the first place. It is easy to assume that you need to modify one of these parameters to thwart deadlocks and hung processes.

In some situations, the source of your problem is in fact a misconfigured locking scheme, but in reality, most locking problems are the result of questionable query methods and database design. The Lock Escalation Threshold Maximum is intended as an upper boundary for the server to use. The server will hold up to this maximum number of 2K page locks per statement executed before attempting to escalate the lock to a table lock.

The lower boundary, the Lock Escalation Threshold Minimum, is used to help keep SQL Server from locking a small table with only a few rows in it every time a query is run against it. The minimum is used in conjunction with the Lock Escalation Threshold Percentage to control this situation and keep locks from getting out of hand. All of these options can be overridden on a statement-by-statement basis with optimizer hints. Including hints in your SQL statements will cause these thresholds to be ignored.
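For illustration only (the option names should be confirmed with sp_configure on your server, and the values and the sample pubs table are placeholders), the three thresholds and a hint that bypasses them look like this:

EXEC sp_configure 'LE threshold maximum', 500   -- page locks held before escalating to a table lock
EXEC sp_configure 'LE threshold minimum', 20    -- floor that protects small tables
EXEC sp_configure 'LE threshold percent', 25    -- percentage of the table's pages
RECONFIGURE
GO
-- A table-lock optimizer hint on a sample table; when a hint is present,
-- the thresholds above are ignored for that statement.
SELECT au_lname FROM authors (TABLOCK)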

Great care should be taken when overriding the Query Optimizer. The use of optimizer hints is covered in Chapters 5 and 6.

Fill Factor

Fill Factor is a parameter that you should learn about almost as early as opening the shrinkwrap on your software.

Changing this value affects how indexes will be rebuilt, as well as how the space on your server is used. See Figure 3: both examples show the same index, yet the set of 2K pages filled only halfway actually requires more disk space. The default for SQL Server is 0, or 100 percent.

I usually change this to 80 or 90 percent, depending on the type of server that I am configuring. When you change this server option to a lesser value, the effect on your existing data is not apparent until you rebuild your indexes.

When an index is created on your SQL server, the fill factor is used to leave space available on pages for your index to grow into. This growth is what the fill factor allows to happen without fragmenting your index and slowing response down.
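A short sketch of both levels of control follows; the option name, table, and index are examples, and the percentages are the kind of values discussed above rather than recommendations:

EXEC sp_configure 'fill factor', 90   -- server-wide default applied when indexes are built or rebuilt
RECONFIGURE
GO
-- An individual index that expects heavy inserts can use a lower fill factor.
CREATE NONCLUSTERED INDEX ix_orders_custid
    ON orders (customer_id)
    WITH FILLFACTOR = 70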

When a page is filled, any other data that should be placed on that page has no place to go. Therefore, the data must be split and placed in other pages or even other extents. Extents are eight contiguous 2K pages stored in your database. These page splits leave the server spending extra time moving the disk head for read operations or write operations. I have seen some messages on the Internet that recommend using a 50 percent fill factor or even less, to help keep performance up on a system.

This is fine if you have a lot of disk real estate and are not concerned with the amount of space you are losing to indexes.

I recommend keeping this value as high as possible and using specific indexes with lower-than-normal fill factors to lessen the impact on your disk space.

The only time I use a fill factor of 100 percent is for read-only data or for data that will be modified or changed so rarely that the impact on performance is not an issue.

The fill factor is only checked when you build or rebuild an index. When you add data to an index, the fill factor is ignored.

These are the main server parameters that you should concern yourself with at this point in the process. You should become aware of each parameter and its function on your server. The more you know about these parameters, the more you can understand why things happen the way they do on your server.

Application Parameters

Application-specific parameters represent the second level of configuration.

Microsoft SQL Server does not allow you to configure global client variables; therefore, you should develop a table-driven approach to application parameters. Such a parameter table can be pinned in memory; this allows you to place the table containing your parameters into the data cache and keep it from being flushed, thereby increasing the performance of queries against this table. Once the table is pinned, changes to the data are logged, and the table can be recovered in the event of a media failure.
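Pinning is done with DBCC PINTABLE; the database and table names below are hypothetical, and DBCC UNPINTABLE reverses the change:

DECLARE @dbid int, @objid int
SELECT  @dbid  = DB_ID('appdb'),
        @objid = OBJECT_ID('appdb..app_parameters')
DBCC PINTABLE (@dbid, @objid)       -- keep the small parameter table in the data cache
-- DBCC UNPINTABLE (@dbid, @objid)  -- release it if the table grows too large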

Use this feature with caution. If you pin a large table, it will consume your available data cache and impact the performance of queries against all your other tables. When storing your application variables on your SQL server, beware of creating a table that is high maintenance and low return-on-investment.

Create only the indexes needed and do not create a huge, wide, and hard-to-understand table structure.

Registry-Type Tables

There are two good methods for using server-based client application parameters. First is the creation of a registry-type table. This table allows you to store a broad range of data in a very flexible format.

Beware, however, of registry pitfalls. You have seen what can happen to a machine running Windows 95 or Windows NT when applications write to the registry in a haphazard way—disasters and the dreaded blue screen of death. Clean up after yourself and pay attention to keeping the data and keys clean and up-to-date. Remove unused keys and perform maintenance on this table regularly.
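One possible shape for such a table is sketched here (the names and sizes are invented): narrow rows, one single-value entry per key, and only the index the lookups need.

CREATE TABLE app_registry
(
    app_name  varchar(30)  NOT NULL,
    key_name  varchar(60)  NOT NULL,
    key_value varchar(255) NULL,
    CONSTRAINT pk_app_registry PRIMARY KEY (app_name, key_name)
)
GO
-- Typical single-value lookup by a client application.
SELECT key_value
FROM   app_registry
WHERE  app_name = 'OrderEntry' AND key_name = 'DefaultWarehouse'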

Store only single-value data in this type of structure. A registry-type table does not lend itself to relational or set-based data queries. Using a list-type table structure to return result sets to your client application is much more efficient.

List-Type Tables

A list-type table can be joined with other tables or used to return rows to a client application. One possible use of this kind of information might be a current user table. This table may store such information as the key values of the records the user is accessing and permission variables on those objects.

By storing this information in a table, users can very easily be guided back to where they were last working, making the application appear to be smart and intuitive. Indexes play an even greater role in performance with this kind of table than with a normal data structure. Beware of multiple indexes on this kind of table. Look at the queries that run against it and determine the minimum index configuration required.
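A current-user table along those lines might look like the following sketch (the columns are invented for illustration), with a single clustered index to keep update overhead down:

CREATE TABLE current_user_state
(
    login_name    varchar(30) NOT NULL,
    table_name    varchar(30) NOT NULL,
    last_key      int         NULL,      -- key value of the record last touched
    can_update    bit         NOT NULL,  -- permission flag for that object
    last_activity datetime    NOT NULL
)
GO
CREATE CLUSTERED INDEX ix_user_state ON current_user_state (login_name)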

Updates are usually more frequent against this kind of table, and therefore the index values will need to be updated often. Most large systems require you to run some sort of configuration scripts to prepare the system for operation. Very few good data-modeling tools are available that let you create a finished structure that takes into account revisions of objects and changes to permission structures.

These types of issues are easily handled with scripts. For example, scripts can be used to add the user-defined error messages for your application to the SQL server. Although these error messages can be configured by hand on a case-by-case basis, doing so lends itself to human error and inaccurately typed messages.

Creating and executing a script file is much faster and allows you to configure two or more servers with exactly the same options. In fact, most data modeling tools allow you to create script files to run against your server for object creation. When writing scripts, you should keep a few ground rules in mind: A script is broken down into batches.

These batches are designated by the word GO. If an error occurs in a batch, that batch will not execute. Other batches in the script will still run, however. You cannot create and reference an object in the same batch. You also cannot drop and create an object with the same name in the same batch. SET statements take effect at the end of a batch.
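The following throwaway sketch (the object name is a placeholder) shows the batch rules in action:

CREATE TABLE demo_batch (id int NOT NULL)
GO                              -- end of batch 1: the table now exists on the server
INSERT demo_batch VALUES (1)    -- batch 2 may reference the object created in batch 1,
GO                              -- but could not have been combined with the CREATE
DROP TABLE demo_batch
GO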


Be sure to use a lot of comments in your scripts. Although I will not go into writing SQL statements in detail until Chapter 5, these tips should be committed to memory. The above list will help you troubleshoot performance problems when executing scripts.

Notice that I have commented extensively in this script, even though it will be run only once in a while and should not change often. This allows other programmers to read and understand quickly what I was attempting to do with the script. Notice that the formatting shown here is for legibility only. Each of the EXEC lines is continued on the next line in this text as indicated by indention but should be on the same line in your script.


The script itself is stored as a text file and serves as a server message initialization script: it drops any existing messages in the chosen range, then adds the messages the application needs, flagging some of them to be logged in the Windows NT Eventlog. To use this sample script, you open the text file with any of the available text-based query tools, then execute it.
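The original listing is not reproduced here, but a minimal sketch of that kind of script, using placeholder message numbers and text, would follow this pattern:

-- Remove any earlier copies of the messages in the chosen range first.
EXEC sp_dropmessage 50001
EXEC sp_dropmessage 50002
GO
EXEC sp_addmessage 50001, 10, 'Order import completed normally.'
EXEC sp_addmessage 50002, 16,
     'Order import failed; see the import log table.',
     @with_log = 'true'    -- ask for this one to be written to the Windows NT Eventlog
GO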

When executed, the script produces output messages such as 'Replacing message.' and 'New message added.' This sample is intended only to illustrate what a script looks like. I will explain in later chapters how to write scripts to perform different tasks on your server. Scripts are a very powerful tool that can be used to save a lot of time and ensure consistency among servers.

Although Enterprise Manager cannot script user-defined error messages, it can generate scripts for just about anything else. You can select pre-existing objects in the Enterprise Manager and have the script created that would be necessary to drop and re-create the object on your server or any other SQL server.

This can be very beneficial when you need to modify an existing object quickly.


This is a very useful feature in a development environment. You have the ability to generate scripts for groups of objects, as well as for specific objects.

Notice the Object Creation and Object Drop checkboxes. These options allow you to drop and re-create objects from the script or scripts you generate. Remember that the Enterprise Manager will generate scripts that will drop and re-create objects, but it will not generate a script to migrate data from one structure to another.

The only option available with the Enterprise Manager is the Transfer utility, accessed by right-clicking on a database and selecting the transfer item. I will go through this process in detail later in this chapter. DBArtisan is designed to manage multiple servers of various manufacturers from a single utility. This can be a great advantage over the Enterprise Manager. Some options listed and displayed may not be available for Microsoft SQL Server, so take the time to learn what can and cannot be accomplished with each tool.

Rapid SQL, also from Embarcadero, provides a fast scripting option that allows you to use objects in a list as a base for building your scripts. Point-and-click graphic interfaces allow you to quickly select the objects you wish to modify or to create custom scripts with very little effort.

Note the formatted and easy-to-read text.

Transferring Objects

One of the most common tasks in developing on a separate machine is the transfer of your created objects and any supporting data to your production environment. In a multiple development-server environment, this transfer process becomes even more significant. Ensuring that the correct version of each object is in place can be a daunting task.

Confusion can surround the process of actually putting your changes in place on another server. In a development environment, you typically have the luxury of making backups of your data and applying your changes without the pressures of the production machine. However, there will come a time when your work must be migrated to another server as quickly and painlessly as possible.

This tool is much less error-prone in version 6.5. One drawback is that the transfer utility does not allow you to remap old structures to new. The Transfer Manager is a handy tool to move new objects to a production machine.

It does, however, fall on hard times when you must change an existing structure that contains data. To the credit of Microsoft, the Transfer Manager does a wonderful job of copying a complete structure, including data, from one database to another empty container. It allows you to copy an existing production database to a development server with one easy step. Including the data in the transfer allows you to take a snapshot of your production server and place it on another server quickly and easily.

Keep in mind, however, that not all development servers have sufficient disk space to hold a complete database transfer; you might have to get creative and transfer everything but the data, then use scripts to grab smaller subsets of the larger tables. When you are migrating existing data, it is a good practice to rename your target tables. There is a stored procedure in SQL Server 6.5, sp_rename, that handles renaming an existing table.

Once you have created the new table, you can remap the columns by using a script to migrate and convert data types and allow for new columns.
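A hedged sketch of that rename-and-remap approach follows; the table and column names are hypothetical, and the CONVERT shows the kind of data type change a migration script handles:

EXEC sp_rename 'orders', 'orders_old'    -- keep the existing data under a new name
GO
CREATE TABLE orders                      -- the new structure
(
    order_id   int         NOT NULL,
    cust_code  varchar(12) NOT NULL,     -- was an integer column in the old structure
    entered_on datetime    NOT NULL
)
GO
INSERT orders (order_id, cust_code, entered_on)
SELECT order_id,
       CONVERT(varchar(12), cust_id),    -- remap and convert the old column
       entered_on
FROM   orders_old
GO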

Transferring Data

The actual process of transferring data can be accomplished in many ways. I find that a few preliminary steps can make the process smooth and reduce frustration:
- Plan your migration and test it locally first.
- Create another container that is not under the load of production and test your transfer to ensure you have not missed anything.
- Test run the data or schema transfer to uncover any potential problems with your process.

There are three main methods that I use to allow for the transfer or migration of objects and data from one container to another: scripts, the Transfer Manager, and bcp. With a script, you have control over the column data types and the remapping of existing data, and the script can be run whenever you need to move your data.

DBArtisan has a table edit utility that is easy to use. DBArtisan actually lets you modify the structure of an existing table with data and create a single script that will rename the existing table, create the new structure, map the old data to the new table, and re-create any dependent objects that rely on the target table.

I have talked with many people who wish there were a graphic interface for this utility; by the time this book is printed, I am certain someone will have written one. I typically use a batch file that I can type into and run repeatedly for testing purposes. The entire bcp command goes on a single line, with a versatile list of options.

See Table 3. The main options cover the direction of the copy operation, the name (including full path) of the source or target file, the first and last rows to copy (the defaults are the first and last rows of the file), the batch size (the default is all rows in one batch), the Microsoft SQL Server native file format, and a switch designating that there are identity columns in the file whose values should override the identity values currently in the table.
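As a rough illustration of such a batch file (the server name, login, password, table, and file path are placeholders, and the switches should be checked against the bcp documentation for your version), a one-line character-mode export might look like this:

REM Copy the authors table out of the pubs sample database in character mode.
REM -S names the server, -U and -P supply the login, -c selects character format
REM (use -n for the native format mentioned above).
bcp pubs..authors out c:\transfer\authors.bcp -c -SMYSERVER -Usa -PSecret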

