Top 6 Essential Data Architecture Insights eBook
Whether you’re working with relational data, schema-less (NoSQL) data, or model metadata, you face challenges in providing the right access and context for your enterprise data assets. How do you effectively integrate data from multiple sources and make it useful and usable across your organization?
You need a data architecture that can actively leverage information assets for business value. The most valuable data has high quality, business context, and visibility across the organization. Check out this must-read eBook for essential insights on important data architecture topics, including:
Finding the top offending statements for a specific workload can be hard if you are not using the best tool in your toolbox. On SQL Server 2012 and later, you can leverage Extended Events, which are included in SQL Server Management Studio (SSMS), to build your first event session and capture your workload.
Read this article and watch the short video to see how you can use the Extended Events tools to find your top offenders.
- Who/When dropped or altered an object in my database?
- What is the growth pattern of my database?
- Who/When/How were configuration settings changed?
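The questions above can be answered with an Extended Events session. As a minimal sketch (the session name and file path are illustrative), this captures who dropped or altered objects in a database:

```sql
-- Illustrative Extended Events session: record object drops and alters,
-- along with who ran them, to an event file for later review.
CREATE EVENT SESSION [TrackSchemaChanges] ON SERVER
ADD EVENT sqlserver.object_altered (
    ACTION (sqlserver.username, sqlserver.client_hostname, sqlserver.sql_text)),
ADD EVENT sqlserver.object_deleted (
    ACTION (sqlserver.username, sqlserver.client_hostname, sqlserver.sql_text))
ADD TARGET package0.event_file (SET filename = N'C:\XE\SchemaChanges.xel');
GO
ALTER EVENT SESSION [TrackSchemaChanges] ON SERVER STATE = START;
```

The captured `.xel` file can be opened directly in SSMS to see the user, host, and statement behind each schema change.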
You may also be interested in Make SQL Queries Run Faster, an on-demand webinar by SQL expert John Sterrett. You will learn some T-SQL performance tuning tips from a DBA who has a developer background.
There is a growing need to integrate diverse databases with each other and with new data sources. This is accomplished through integration of data models and their associated metadata across the workflows associated with database design, development and integration. Data models that are grounded in meaningful business context can ensure ease of database development and data sharing.
In this blog post, you can read an excerpt of the IDC Technology Spotlight paper, Enabling Enterprise Agility Through Model-Driven Data Design. This analyst report examines the growing importance of model-driven processes for database design and management. The paper specifically focuses on:
- The role of business-driven metadata in database modeling, design, and development tools
- Customer challenges for managing enterprise data and options to address them
- How Embarcadero’s ER/Studio solution addresses the needs of today’s enterprise data landscapes
Experience the future of data modeling with ER/Studio
Only a professional data architecture tool can provide the comprehensive modeling and metadata environment that organizations need for complex and dynamic data landscapes. More than 10,000 companies have chosen to deploy ER/Studio to provide value to their data. Selection criteria were based on various factors, including ease of use, ease of migration, support for big data platforms, metadata collaboration, and excellent technical support.
Being a DBA is one of the most challenging jobs. Why do we say so? I have seen developers claim that a DBA has the easiest life; they never know everything a DBA does in a day's work. Most people think taking a backup is the only thing a DBA does. However, there is much more to this job that is never seen by many.
One of the hallmarks of a smart DBA is automating the tasks at hand. Which activities must a DBA automate? Though there are many candidates, here are a few that we consider worth automating:
Backup is one of the most mundane tasks a DBA needs to perform. Moving from manual backup techniques to automation is something most DBAs do, typically by building maintenance plans and scheduling them with SQL Server Agent. This is the most common method we have seen. These techniques apply to standard SQL Server native backups; if you are using third-party backup tools, most of them expose scheduling as well.
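A scheduled backup ultimately comes down to a statement like the following, which a maintenance plan or SQL Server Agent job runs on a schedule. This is a minimal sketch; the database name and file path are illustrative:

```sql
-- Illustrative scripted full backup, as run by an Agent job or
-- maintenance plan. COMPRESSION and CHECKSUM reduce size and verify pages.
BACKUP DATABASE [SalesDB]
TO DISK = N'D:\Backups\SalesDB_Full.bak'
WITH COMPRESSION, CHECKSUM, INIT,
     NAME = N'SalesDB full backup';
```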
Monitoring SQL Server
We have seen single DBAs get tasked with monitoring many servers at any given point in time. When working with more than a few servers, it is important to know the health of each server from a single point of view. It is humanly impossible to sit in one place and watch each and every server to check whether:
- Jobs completed successfully
- Capacity is running out (disk space, CPU, memory, network)
- The server is slowing down and not performing as expected
- Errors are being generated on the server
- DoS attacks are happening against the server
- The physical hardware has failed
- Connectivity is malfunctioning
And many more. There are no fixed rules when it comes to monitoring SQL Server. Each DBA needs to develop his or her own methodology for each of these tasks.
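As one sketch of such a check, the first item in the list (job failures) can be pulled from the msdb job history tables:

```sql
-- Illustrative monitoring query: list SQL Server Agent jobs whose
-- last recorded outcome was a failure.
SELECT j.name, h.run_date, h.run_time, h.message
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h
    ON h.job_id = j.job_id
WHERE h.step_id = 0          -- step 0 rows hold the overall job outcome
  AND h.run_status = 0;      -- 0 = failed
```

A query like this, run centrally against each managed server, is one building block of a single-point-of-view monitoring setup.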
Patch management is one of the simpler tasks, but it cannot be automated as-is; it needs to be documented and made part of a process. Each patch needs to be downloaded locally first. Then the application dependencies need to be evaluated and tested before the patch is applied to production. When applying it to production, we need to take care of the order in which patches are applied, whether downtime is required for the patch, and, in clustered environments, the order in which the patch will be applied across nodes.
The final piece of effective monitoring is a robust alerting system. Think of situations where the server is running severely short of space and the DB service is stalling, or a hardware component malfunctions: you want to be notified via email or SMS in an automated way, rather than an end user calling you in the middle of the night to complain.
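SQL Server Agent alerts are one way to wire this up. As a minimal sketch (the operator name and email address are illustrative, and Database Mail must already be configured), this notifies an on-call DBA when a severity-17 "insufficient resources" error occurs:

```sql
-- Illustrative Agent alert: email an operator on severity-17 errors.
EXEC msdb.dbo.sp_add_operator
    @name = N'OnCallDBA',
    @email_address = N'dba-oncall@example.com';

EXEC msdb.dbo.sp_add_alert
    @name = N'Severity 17 - Insufficient Resources',
    @severity = 17,
    @notification_message = N'Check disk space and memory on the server.';

EXEC msdb.dbo.sp_add_notification
    @alert_name = N'Severity 17 - Insufficient Resources',
    @operator_name = N'OnCallDBA',
    @notification_method = 1;   -- 1 = email
```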
Automation is not a luxury for a DBA; it is how they execute their job. Being efficient and smart with the tasks at hand is what makes a DBA effective. Developers rarely see everything a typical DBA does. Automation lets organizations build processes that act as proactive measures for effective maintenance of their environments. Do let us know which processes you have automated in your environment recently.
Here are 10 quick SQL Server tips and tricks for DBAs compiled by SQL Server expert Pinal Dave.
- RAISERROR in the format RAISERROR integer 'string' is discontinued as of SQL Server 2012.
- SQL Server should be behind the firewall not exposed to Internet.
- We can check the network packet size by querying the system catalog view sys.configurations.
- Audited events can be written to the audit log by using the new sp_audit_write procedure.
- SQL Server error logs can reveal a great deal of information about your server.
- Use "allow only encrypted connections" only if needed for end-to-end encryption of sensitive sessions.
- Enable only the optional features that you will immediately use.
- The tcp/ip packet consists of a header that is at least 20 bytes in size.
- The header of the ping packet is 28 bytes plus the size of the buffer you specify.
- In the event of a system crash, indirect checkpoints provide potentially faster, more predictable recovery time.
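Two of the tips above can be illustrated in a few lines: the RAISERROR syntax that remains supported after the old `RAISERROR integer 'string'` form was discontinued, and checking the network packet size via sys.configurations. A minimal sketch:

```sql
-- Supported RAISERROR form: message, severity, state (with optional
-- substitution arguments). The old "RAISERROR 50001 'message'" form is gone.
RAISERROR (N'Something went wrong in step %d.', 16, 1, 42);

-- Check the configured network packet size.
SELECT name, value_in_use
FROM sys.configurations
WHERE name = N'network packet size (B)';
```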
Try the premier cross-platform database administration solution today
Embarcadero® DBArtisan® is the premier database administration toolset, helping DBAs maximize availability, performance, and security across multiple DBMSs. This essential toolset consistently boosts productivity, streamlines routine tasks, and reduces errors.
Join Embarcadero's Rob Loranger and Ron Huizenga in this upcoming webinar - From Big Data to Total Data, Let the Migration Begin.
Here are 10 quick SQL Server tips and tricks for database developers compiled by SQL Server expert Pinal Dave.
- For inefficient query plans: Check for issues with bad cardinality estimates.
- The maximum degree of parallelism can be limited server-wide by using the max degree of parallelism option.
- DMV - sys.dm_os_nodes provides information about CPU node configuration for SQL Server.
- A slow query can be caused by missing indexes, which force table scans.
- Plan guides can be created for ad hoc queries as well as queries inside a stored procedure.
- On both 32-bit and 64-bit platforms, memory that is allocated through the AWE mechanism cannot be paged out.
- Entities with the same query_hash value have a high probability of referring to the same query text.
- The query hash is computed from the tree structure produced during compilation.
- A change in the cardinality of a table variable does not cause a recompilation.
- You can use Performance Monitor and SQL Server Profiler to detect excessive compilation and recompilation.
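The query_hash tips above lend themselves to a quick diagnostic query. As a minimal sketch, this groups cached queries by query_hash to surface statements that are identical except for literal values:

```sql
-- Illustrative query: aggregate cached plans by query_hash to find
-- the query shapes consuming the most CPU across all their variants.
SELECT query_hash,
       COUNT(*)               AS query_count,
       SUM(total_worker_time) AS total_cpu_time
FROM sys.dm_exec_query_stats
GROUP BY query_hash
ORDER BY total_cpu_time DESC;
```

A high query_count for a single hash often indicates ad hoc queries that differ only in literals, which are candidates for parameterization.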
Take your SQL Development to the Next Level
Embarcadero® Rapid SQL® is the intelligent SQL IDE that gives database developers and DBAs the ability to create high-performing SQL code on all major databases from a single interface. This toolset simplifies SQL scripting, query building, object management, debugging, and version control with intuitive, innovative tools.
|DATA MODELING NEWSFLASH|
IDC Technology Spotlight: Enabling Enterprise Agility Through Model-Driven Data Design
There is a growing need to build and integrate databases with each other and with new data sources. This is accomplished through integration of data models and their associated metadata across the workflows associated with database design, development and integration. Such data can be mapped to an enterprise business glossary, allowing business users to find the data they need, and technologists to determine the business value of each application function point based on the data it manages.
Data models that are grounded in meaningful business context can ensure ease of database development and data sharing. This analyst report examines the growing importance of model-driven processes for database design and management. The paper specifically focuses on: