The solution requires placing a "Schema.ini" file in the same directory as the file you are attempting to upload, and the file has to be named "schema.ini" (case insensitive). The contents of this simple text file should be as follows:
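As a minimal sketch only (the section name, column names, and types below are placeholders for your own file — schema.ini's section header must match the data file's name exactly):

```ini
; Section name must match the data file name exactly.
[MyData.csv]
Format=CSVDelimited
ColNameHeader=True
; Explicit column definitions override the driver's type guessing.
Col1=CustomerID Long
Col2=CustomerName Text Width 100
Col3=OrderDate DateTime
```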
Recently, I needed to find correct addresses for physical locations for which I had only latitude and longitude. The general process follows:
This script proves out the process for generating good addresses from JEA's bad addresses through the provided lat/long (it assumes, and hopes, that the lat/long values are good).
1. TMP TABLE: Create a table of the addresses, lat, and long.
2. XY PROJECTION: Create a CTE that uses a function on each row of the tmp table to fill in the XY projection.
3. XY PROJECTION –> URL: Create a CTE that builds the URL from the XY values.
4. CALL FOR RE: Using the prior CTE, create a CTE that uses a function on each row to fill in the new (correct) RE. Continue reading
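The chain of steps above can be sketched as a series of CTEs. The two functions here (dbo.fn_LatLongToXY and dbo.fn_GetAddressForXY) are hypothetical stand-ins for whatever projection and reverse-geocoding routines you actually use, and the URL is a placeholder:

```sql
-- 1. TMP TABLE: the addresses with their lat/long.
CREATE TABLE #addr (AddrID int, BadAddress varchar(200), Lat float, Long float);

WITH xy AS (
    -- 2. XY PROJECTION: hypothetical function projecting lat/long to XY.
    SELECT AddrID, BadAddress,
           dbo.fn_LatLongToXY(Lat, Long) AS XY
    FROM #addr
),
xyurl AS (
    -- 3. XY PROJECTION --> URL: build the lookup URL from the XY value.
    SELECT AddrID, BadAddress,
           'http://example.local/lookup?xy=' + XY AS LookupUrl
    FROM xy
),
re AS (
    -- 4. CALL FOR RE: hypothetical function returning the corrected address.
    SELECT AddrID, BadAddress,
           dbo.fn_GetAddressForXY(LookupUrl) AS CorrectAddress
    FROM xyurl
)
SELECT * FROM re;
```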
Hey, everyone… our annual get-together of SQL professionals is happening on May 7th, at UNF, here in Jacksonville. I will be speaking at 9 AM on breaking into the world of Azure and PowerBI, using (mostly) the skills you likely already have in SSMS. From 9 AM to 5 PM, it is one of your very best opportunities to learn a lot of new things, develop professionally, meet new people, enjoy a free lunch, and… have a great time!
Now I am developing a PowerBI demo. Here’s my first look:
This is based on Duval County Florida real estate parcels.
So, sometimes you need a simple way to run (often the same) command on two different servers at the same time. Perhaps setting up a linked server is difficult, or simply too time-consuming. Here's an option: try SQLCMD. Continue reading
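A minimal sketch of the idea, assuming two placeholder server names and Windows authentication; the loop only echoes each command so you can inspect it before running:

```shell
# Run the same query against two servers with sqlcmd.
# SERVER01/SERVER02 and the query are placeholders -- substitute your own.
QUERY="SELECT @@SERVERNAME AS server_name;"
for SRV in SERVER01 SERVER02; do
  # -S: target server, -E: Windows authentication, -Q: run query and exit.
  CMD="sqlcmd -S $SRV -E -Q \"$QUERY\""
  echo "$CMD"   # replace echo with eval to actually execute
done
```

sqlcmd also accepts `-i file.sql` if you would rather run a script file than an inline query.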
So, often you need to get Excel data into SQL "in a hurry" for analysis. After you've done a base-case analysis, you can go about loading the data in a more "regular" fashion. What's the fastest way to get an Excel file into SQL? AceOleDb… that's how. Continue reading
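The details follow after the break, but the classic ACE OLE DB pattern is an ad hoc OPENROWSET query. A sketch, assuming a workbook at C:\Temp\MyData.xlsx with a header row on Sheet1:

```sql
-- Requires the ACE OLE DB provider installed and, typically,
-- 'Ad Hoc Distributed Queries' enabled on the server.
SELECT *
INTO   #ExcelImport                     -- land the data in a temp table
FROM   OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                  'Excel 12.0;Database=C:\Temp\MyData.xlsx;HDR=YES',
                  'SELECT * FROM [Sheet1$]');
```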
SQL 2012 introduced a new deployment model. Many of us, for various reasons, didn't move along with Microsoft and continued to use and deploy the old SSIS packages (.dtsx), even to SQL 2012. So, that being the case, how do you get to these old packages?
VS 2012 is looking for "Project Deployment Files" or for "Integration Services" catalogs. We didn't have those things before 2012, and we can still operate without them. But how? Continue reading
Hey, everyone! There’s a great event here in Jacksonville, Florida. It’s coming in May. Saturday, 9 May, to be specific. It is like a mini Tech-ed. Free Food. Great Technical presentations. Click the picture to the right to register!
Lots of networking. Fellowship. Swag. After-hours party. What could be better than that? I will be speaking on "Government Transparency Powered by SharePoint 2013 and SQL 2012".
I know I have been AWOL for some time… I have been heavily focused on bringing up SharePoint 2013 and on empowering it with SQL 2012. And actually, there's even more to it than that. One of the benefits of 2013 is that Microsoft has extended the value proposition by enabling the creation of a public site without real cost. Not only that, Microsoft says you can access your SSAS components from that public site. OK… think about that. That's really big! That's what we collectively thought as well.
HOWEVER, there are some caveats. Actually, there are quite a few caveats that you can easily stumble into. Going forward, I expect to blog about these as we find them.
The first I'm going to mention has to do with anonymous access to an Excel file. It turns out that while other users aren't limited (so far as I know today), an anonymous user can't browse Excel files greater than 10 MB! For context, we had built a PowerPivot-based Excel workbook with some 180 MB of data. We published it to the SharePoint 2013 site. We went to the site ourselves (i.e., IE knew our identity and passed it to SharePoint). SharePoint showed us the file in a browser. Yay! Success!
Hold your horses… check with anonymous! We anonymously browsed to the SharePoint public site, navigated to the Excel file, attempted to open it, and ERROR! Arghh! Why would Microsoft do this? Well, it's money, of course!
See here for more Microsoftian data:
More to follow!
Hey, everyone! There’s a great event here in Jacksonville, Florida. It’s coming in May. It is like a mini Tech-ed. Free Food. Great Technical presentations. Click the picture to the right to register!
Lots of networking. Fellowship. Swag. After-hours party. What could be better than that? I will be speaking on "How SQL 2012 Empowers BI in SharePoint".