By Bradley Schacht
Click here to view Brad's entire blog.
SharePoint uses a series of SQL Server databases to store configuration information, security settings and all the user content that has been created. The Central Administration site is no different and has its own content database, created when you run the SharePoint configuration wizard. Unfortunately, unlike when creating service applications, you do not get to choose the name of this database during the wizard. The database name will default to SharePoint_AdminContent_<GUID>.
This could be problematic for a number of reasons. Many companies have a set of standards for database names. Other times you will want to change it simply because you have a small OCD issue (that’s me). While the task is not as easy as renaming the database in SQL Server Object Explorer, it isn’t difficult either. Just follow the steps below and you will clear up that OCD (or naming convention) issue in no time.
New-SPContentDatabase -Name SharePointCentralAdmin -WebApplication http://kerberos-sp:36000
Get-SPWebApplication -Identity http://kerberos-sp:36000 | Get-SPContentDatabase | Select-Object ID, Name, WebApplication | Format-List
Get-SPSite -ContentDatabase c87506a9-b87d-40b8-9582-aac9ee89c8f8 | Move-SPSite -DestinationDatabase f79cb9d8-8e45-4405-82c9-081f58bce7a0
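Once every site collection has been moved to the new database, the final cleanup step is typically to dismount the old GUID-named database. A minimal sketch, reusing the example GUID from the output above (verify the ID in your own farm first):

```powershell
# Dismount the original SharePoint_AdminContent_<GUID> content database.
# Assumes all site collections have already been moved off of it;
# the GUID below is the example ID returned by Get-SPContentDatabase earlier.
Remove-SPContentDatabase -Identity c87506a9-b87d-40b8-9582-aac9ee89c8f8
```

With the old content database dismounted, only the renamed database remains attached to Central Administration.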
By: Devin Knight
Click here to view Devin's blog.
In a previous post I wrote about Using a Hyper-V VHD in an Azure VM. Today I’d like to show you what my next steps were. My goal with using Azure is to create a lab environment for students that fits the following needs:
The point of this post is to guide you through creating virtual machines for multiple students or employees with a quick and easy script. By the way, I have previously posted how to solve the problem of powering up and powering down the virtual lab environments in my blog about Setting up Automation in Azure.
These steps will guide you through how to take what you learned in my blog about Using a Hyper-V VHD in an Azure VM and create an image from the VHD you uploaded. You can also create an image from a virtual machine that you created from the Azure Gallery. Once an image is created you can then spin off as many virtual machines as you would like from that image.
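If you prefer to script the capture step as well, it can be done with the classic Azure Service Management PowerShell module. A minimal sketch, assuming a generalized (sysprepped) VM; "LabService" and "LabTemplate" are hypothetical names here:

```powershell
# Capture a generalized (sysprepped) VM as a reusable image
# using the classic Azure Service Management module.
# "LabService" and "LabTemplate" are placeholder names for the
# cloud service and the template VM; pick an image name you will
# reference when provisioning new VMs.
Save-AzureVMImage -ServiceName "LabService" -Name "LabTemplate" `
    -ImageName "ImageName" -OSState Generalized
```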
With the image created, you can now either manually create new virtual machines from the image or use PowerShell to scale your solution better. For me, PowerShell made the most sense because I’m not trying to build just one virtual machine. In my case I actually need to build 15 identical virtual machines (for 15 students) based off the same image.
# Create Azure VMs for Class
# Run in a PowerShell window by typing .\CreateVMs.ps1 -vmcount 3
param([Int32]$vmcount = 3)

$startnumber = 1
$vmName = "VirtualMachineName"
$password = "pass@word01"
$adminUsername = "Student"
$cloudSvcName = "CloudServiceName"
$image = "ImageName"
$size = "Large"

for ($i = $startnumber; $i -le $vmcount; $i++)
{
    $vmn = $vmName + $i

    New-AzureVMConfig -Name $vmn -InstanceSize $size -ImageName $image |
        # Each VM in the same cloud service needs a unique public port,
        # so offset the public RDP port by the VM number.
        Add-AzureEndpoint -Protocol tcp -LocalPort 3389 -PublicPort (3389 + $i) -Name "RemoteDesktop" |
        Add-AzureProvisioningConfig -Windows -AdminUsername $adminUsername -Password $password |
        New-AzureVM -ServiceName $cloudSvcName
}
.\CreateVMs.ps1 -vmcount 3
Once your virtual machines are created, your next step is to consider whether you want to keep them running all the time or only during business hours. Keep in mind you only pay for the time your virtual machines are started. For my scenario I wanted to turn my machines off during non-business hours and back on in the morning, which can be automated through Azure Automation. Read this post on Setting up Azure Automation.
Click here to view Devin's entire blog.
Instructor: Kathi Kellenberger
Click here to watch the full webinar.
Click here to view Kathi's blog.
Q: Can we develop SSIS packages using SSMS or do we have to use SSDT?
A: You must use SSDT. That makes sense; it is a development tool!
Q: Any impact when upgrading from 2008 SSIS to SSIS 2014?
A: Except for just trying it out, I haven’t had the chance to do any real upgrades. It’s important to figure out if you are going to move to the new Project Deployment model and do the planning. Once I get some more experience with this, I’ll do another webinar just about it.
Q: Can SSIS 2012 sense the environment name and set the environment accordingly?
A: I’m not sure exactly what you are getting at here. In the new project deployment model, you assign the environment.
Q: Can I run a package developed in SSIS 2012 or 2014 in a 2008 R2 environment?
A: I haven’t tried this and a quick search didn’t turn up anything helpful. My gut is telling me that it should work as long as you are using the package deployment model.
Q. Do package configurations still exist in SSIS 2012?
A. Yes, you have the choice of using the package deployment model which gives you the traditional configurations or using the new project deployment model.
Q. Is there a way to make changes to one package without redeploying the entire project?
A. No, in the project deployment model, you must deploy the entire project even if there is just a change to one package.
Presenter: Anthony Martin
SQL Server Integration Services is a versatile tool that makes it easy to move and transform data. Sometimes unusual situations present themselves that leave developers scratching their heads. In this webinar, Anthony will demonstrate how to handle some of these oddball challenges.
Click here to view Anthony Martin's blog.
Q: Could the file cache be used between packages in the same projects?
Yes, the ‘Use file cache’ option of the Cache Transform can be used between packages that are in the same project or even different projects.
In order to do this, you’ll need to configure the Cache Transform with the ‘Use file cache’ option and execute the package. This is because the Cache Transform file (.caw extension) is not created until the package is executed. Next, in another package where you want to reuse the Cache Transform file, add a Lookup transformation to a Data Flow Task. While configuring the Lookup transformation, select the ‘Cache Connection Manager’ connection type on the ‘General’ page. On the ‘Connection’ page, click ‘New’ to create a new Cache Connection Manager, select the ‘Use File Cache’ option and browse to the location where the file was created by the previous package. Configure the rest of the Lookup transformation just like you normally would and you are all set!
Q: Are Lookup Transformations better than just using Stored Procedures in the Source component?
If the connection for the source and the lookup are on the same database and correct indexing exists then a stored procedure in the source component that performs the join for you is probably the appropriate choice. However, if the source and the lookup connections are on different database servers or different file types altogether then the lookup transformation is one of only a few options available and is a highly efficient transformation if configured using the default settings.
Q: Could you please share the SSIS project and SSMS solution?
I’ve posted both solutions to my OneDrive. You can access them here.
View Demo Here!
Pragmatic Works has earned top-ranking as one of Jacksonville Business Journal’s 2014 Best Places to Work. Only companies qualifying by both employee participation in an anonymous survey and scoring in the top percentile are being recognized. Winning companies represent a wide spectrum of industries and the very best in the Northeast Florida business community.
“Pragmatic Works is proud to be named one of Jacksonville Business Journal’s 2014 Best Places to Work,” says Brian Knight, CEO of Pragmatic Works. “We strive to provide the best possible work environment for our employees and would not be a leader in our industry without them. Company culture is extremely important to us, and we go out of our way to emphasize and promote principles such as integrity, creativity and giving back to the community.”
Pragmatic Works offers its employees many benefits including:
· Employees are made partners in the company after one year of employment and get annual payouts from overall company profits based on years of service
· Annual holiday bonuses based on years of service
· Discounted membership to a local gym
· Multiple yearly parties to foster camaraderie and as a reward for hard work
· 401K Matching
· Ample Vacation & Sick Time
“I came on board with Pragmatic Works when it was still a very young company and I have been with them for over five years. The great thing about Pragmatic Works is that they have kept their core values and fantastic company culture intact despite doubling their size year after year,” says Shawn Harrison, Senior Sales Engineer for Pragmatic Works. “It is still as fun and interesting a job now as when I first started, and my passion for working here has only grown.”
Pragmatic Works will be featured as a Best Company to Work For in the Jacksonville Business Journal’s July 4th, 2014 issue.
“The Clay Chamber is honored to recognize Pragmatic Works as our Small Business of the Month. They are a shining example of a small business putting together a great product and business plan, executing it flawlessly, reaching for the stars and landing on the stars,” states Doug Conkey, President of the Clay County Chamber. “Their reach is truly global and in turn has allowed them to help businesses reach their goals. This success also allows Pragmatic Works to help others in need of training through its foundation. The Clay County business community is truly inspired by the example set by Pragmatic Works.”
Pragmatic Works is very active in the community and sponsors events such as Concert on the Green, and has provided panelists for local discussions on marketing and social media. In addition, Pragmatic Works runs a foundation program dedicated to helping motivated, passionate individuals change their lives by learning in-demand technology skills. Pragmatic Works also hosts a kids’ technology camp, where children are introduced to computer programming and robotics in a fun and educational environment.
“Pragmatic Works is honored to be named Clay Chamber’s June 2014 Small Business of the Month,” says Brian Knight, CEO of Pragmatic Works. “We are proud to be leaders in the small business community and to be able to give back to local residents. We have been fortunate enough to double our organization’s size consistently for the past several years, and we look forward to continuing this progression in this incredible community.”
In May of 2014, Pragmatic Works announced that they partnered with Arrow Software to procure their own Analytics Platform System (APS). This makes Pragmatic Works capable of doing Proof of Concepts, Architectural Design Sessions and hands-on demos for any clients interested in upgrading their current data warehouse environment.
Pragmatic Works is the market leader in APS (formerly known as Parallel Data Warehouse) experience. Their team has been expertly trained with the Microsoft Center of Excellence and has worked alongside Microsoft leaders in delivering APS appliances. They are a National Systems Integrator (NSI) Partner, making them one of 35 top partners nationwide across all platforms. Pragmatic Works’ APS appliance also includes HDInsight and PolyBase.
“By being able to conduct our own POCs and ADSs, Pragmatic Works is setting ourselves apart from other partners,” says Adam Jorgensen, President of Pragmatic Works Consulting. “With customers facing incredible increases in data volume, the need is more critical than ever for massively parallel processing (MPP) architecture to gain scalable performance, flexibility, and hardware choices with the most comprehensive data warehouse.”
For more information about Pragmatic Works’ APS services, please visit our website: http://pragmaticworks.com/Services/AnalyticsPlatformSystem.aspx.
Presenter: Jason Strate
There are many "best practices" around that help people decide how to index databases. Having these practices can help alleviate the time that it takes to design an indexing strategy for a database. These practices can be of great use, except when they are wrong. Join us in this session, as we discuss some common myths associated with indexes and then dive into the myths to demonstrate how they can be debunked. At the end of the session, you'll know a few more things about indexes and leave armed with scripts that can help you debunk these myths on your own.
Click here to view the full webinar!
Q: Does indexing even matter now with Hekaton in place?
Absolutely. In-Memory OLTP (Hekaton) only satisfies some use cases. It's great for high-throughput OLTP tables, but that doesn't necessarily make it a good fit for reference tables, slowly changing OLTP tables, tables requiring LOB data types, or nearly all data warehouse solutions.
Q: How do you recommend finding the indexes that can use or benefit from tuning fill factor?
I look at sys.dm_db_index_operational_stats and sys.dm_db_index_physical_stats. The first dynamic management object provides information on the number of allocations, which relates to the rate at which page splits are occurring. Then look at the second dynamic management object to determine which indexes are fragmented. After identifying them, look at how often those indexes need to be rebuilt and adjust the fill factor on the ones that are rebuilt most often.
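As an illustrative sketch of the fragmentation half of that check, here is one way to pull fragmented indexes via Invoke-Sqlcmd; the server name, database name, and the 10 percent threshold are placeholder assumptions, not recommendations:

```powershell
# Hypothetical example: list fragmented indexes in a database named "MyDatabase".
# Server name, database name, and the 10 percent threshold are placeholders.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "MyDatabase" -Query @"
SELECT OBJECT_NAME(ps.object_id) AS table_name,
       i.name                    AS index_name,
       ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID('MyDatabase'), NULL, NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id AND i.index_id = ps.index_id
WHERE ps.avg_fragmentation_in_percent > 10
ORDER BY ps.avg_fragmentation_in_percent DESC;
"@
```

Cross-referencing that list against the allocation counts from sys.dm_db_index_operational_stats points you at the indexes worth a lower fill factor.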
Q: Do you think it is better to have many single column non-clustered indexes or a fewer number of multi-column non-clustered indexes?
It's not something that a single pattern can fit for all tables and workloads. In a general sense, non-clustered indexes with multiple key or included columns can often provide a better performance lift than a large number of single-column indexes. Of course, YMMV.
Q: We have seen some indexes getting fragmented few minutes after we rebuild/reorganize them. What could be the reason?
The main reason would be that the rows in the table are variable length and volatile in those lengths. After the rebuilds and reorganizations, the rows are being modified and page splits are occurring. A secondary reason could be a database shrink; if that is happening, stop doing it immediately. There are other reasons, but these are where I'd start looking.
Q: Would an index rebuild, physically order the primary clustered index?
The index rebuild will re-sort the pages physically in the database, but it will not physically sort the data on the pages. This shouldn't be considered an issue: getting to the data is the primary concern, and once the page is accessed, there it is.
In the past, developing a database was a simple affair where you built and deployed the database to a server down the hall in your datacenter, or in some cases to a server under someone's desk. While those good ole days were a simpler time, they were quite limiting in how a database could be deployed, and there were far fewer options for ensuring stability and performance. Today, SQL developers have many more options for deployment, including the services provided through Windows Azure. In this session, we'll dive into what you need to know to start developing databases with Windows Azure and discuss the value that cloud delivery can provide to your database development.
Q: Can you also create that BACPAC from within Visual Studio?
Yes, provided I am understanding the question correctly. You can use SQL Server Data Tools, within Visual Studio, to create BACPACs.
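Outside of Visual Studio, the same export can also be scripted with SqlPackage.exe. A minimal sketch; the server, database, and output path below are placeholder values:

```powershell
# Hypothetical example: export a database to a BACPAC with SqlPackage.exe.
# Server name, database name, and output path are placeholder values;
# SqlPackage.exe must be on the path (it ships with SQL Server Data Tools).
& "SqlPackage.exe" /Action:Export `
    /SourceServerName:"localhost" `
    /SourceDatabaseName:"MyDatabase" `
    /TargetFile:"C:\Backups\MyDatabase.bacpac"
```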
Q: If you want to learn SQL Azure, how does the pricing work? I was a little confused with how charges are incurred, I assume that it's not by each query you execute, etc..
For SQL Database, pricing is based on the size of the database and the network throughput that is incurred. The compute per query isn't charged, just the data throughput afterward. You can find more information here - http://azure.microsoft.com/en-us/pricing/details/sql-database/
As far as getting started goes, if you have an MSDN account, you can actually get lower than the published rates.
Q: How do you troubleshoot problems on the azure db, like querying the DMVs and Extended events, where do you save the file or what other option is there?
Tracing similar to on-premises databases is not available yet. There are DMVs, though, that can provide a lot of what is needed.
Q: I have SQL Server 2012 on my machine. When I create a project in Data Tools, I am just seeing SQL Server 2005 & SQL Server 2008 under Project Version.
You have to download SQL Server Data Tools separately from the Visual Studio and SQL Server installs. The tools can be found here - http://msdn.microsoft.com/en-us/data/tools.aspx
Q: Wouldn't you have to migrate from Cloud 2012 to Cloud 2014 to XXX?
No, changes are introduced across the SQL Database SaaS platform on a regular basis. The idea is that as you own and use the service, it is continuously improving.