
Msg 15281, Level 16, State 1, Line 2 SQL Server blocked access to STATEMENT 'OpenRowset/OpenDatasource' of component 'Ad Hoc Distributed Queries' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'Ad Hoc Distributed Queries' by using sp_configure. For more information about enabling 'Ad Hoc Distributed Queries', search for 'Ad Hoc Distributed Queries' in SQL Server Books Online.
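This error is raised when a query uses OPENROWSET or OPENDATASOURCE while ad hoc distributed queries are disabled (the default). For example, a statement like the one below would fail with Msg 15281 until the option is enabled; the provider, file path, and sheet name here are only illustrative placeholders:

SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
     'Excel 12.0;Database=C:\Data\Sample.xlsx;HDR=YES',
     'SELECT * FROM [Sheet1$]');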

There are two ways to resolve this issue: Method 1 runs the sp_configure system stored procedure to enable the option, and Method 2 uses the SQL Server Management Studio GUI.

Method 1

Run the following statements:


-- Expose advanced options so that 'Ad Hoc Distributed Queries' can be configured.
EXEC sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
-- Allow ad hoc distributed queries via OPENROWSET and OPENDATASOURCE.
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
GO
RECONFIGURE;
GO

Explanation:

Here sp_configure displays or changes global configuration settings for the current server. 

By setting 'show advanced options' to 1, we can list and change the advanced options through sp_configure; the default is 0.

With advanced options shown, we can then enable the 'Ad Hoc Distributed Queries' option.

Ad Hoc Distributed Queries: By default, SQL Server does not allow ad hoc distributed queries using OPENROWSET and OPENDATASOURCE. When this option is set to 1, SQL Server allows ad hoc access.

In other words, before 'Ad Hoc Distributed Queries' can be enabled, 'show advanced options' must first be turned on through sp_configure.
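The reason for this ordering is that 'Ad Hoc Distributed Queries' is flagged as an advanced option. A quick way to see this is the sys.configurations catalog view (a small check, shown here only for illustration):

-- is_advanced = 1 means the option stays hidden until 'show advanced options' is on,
-- so sp_configure will not accept it by name before then.
SELECT name, value, value_in_use, is_advanced
FROM sys.configurations
WHERE name = 'Ad Hoc Distributed Queries';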

RECONFIGURE:

Updates the currently configured value (the config_value column in the sp_configure result set) of a configuration option changed with the sp_configure system stored procedure. 

Because some configuration options require a server stop and restart to update the currently running value, RECONFIGURE does not always update the currently running value (the run_value column in the sp_configure result set) for a changed configuration value.
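'Ad Hoc Distributed Queries' is a dynamic option, so RECONFIGURE applies it immediately and no restart is needed. After running Method 1 (with 'show advanced options' still enabled), the two columns can be compared directly:

-- The result set includes config_value and run_value;
-- both should show 1 once RECONFIGURE has completed.
EXEC sp_configure 'Ad Hoc Distributed Queries';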

Method 2

We can also enable the option through the Server Configuration facet in SQL Server Management Studio.

In SQL Server 2012, right-click the server in Object Explorer and select Facets.

In the Facets dialog, choose Server Configuration from the Facet drop-down.

Set AdHocRemoteQueriesEnabled to True in its drop-down and click OK. The error is now eliminated.
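Because the error message describes this setting as part of the server's security configuration, some environments prefer to turn ad hoc access back off once the work is done. That is simply the reverse of Method 1:

EXEC sp_configure 'Ad Hoc Distributed Queries', 0;
GO
RECONFIGURE;
GO
EXEC sp_configure 'show advanced options', 0;
GO
RECONFIGURE;
GO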
