Sunday, March 25, 2012
Access data sources from Rendering Extension code
I am writing a rendering extension for Reporting Services (one that
implements IRenderingExtension).
I need to be able to enumerate and access [populated] datasets that were
used in the report. I don't see any way of getting a reference to DataSets
from the Report or createAndRegisterStream objects, or any other way. Can
you recommend something, please?
By the same token, if more than one dataset was defined in the report
designer, but only one was actually referenced from report items (i.e., all
referenced fields belong to one dataset), does RS optimize away the other
datasets, or does it still execute their queries and populate them?
Thank you!
Denis (DenisL at softartisans dot com)|||Hi Denis,
you cannot access populated datasets in the RenderingObjectModel. Just
consider the cases where a report is rendered from an execution or a history
snapshot. In both cases no dataset queries are executed; only the
information stored in the snapshot is exposed in the RenderingObjectModel.
Snapshots typically *only* contain the processed report items (in an
output-format-independent form) but no data.
As processing a report has multiple phases, datasets only get populated if
dataset queries get executed. In any case, populated datasets will never be
accessible through the RenderingObjectModel.
Regarding optimization of dataset executions: if a dataset is not referenced
by any report item in the entire report, then its command text is not
executed. If, however, it is referenced by a report item that can, e.g., be
shown/hidden dynamically based on a condition, the query has to be executed
at processing time.
Right now, only a custom assembly could indirectly access populated datasets
(by passing in field values as parameters) during report processing.
In the next version there will be a way ("CustomReportItem") to "plug in"
your own custom processing controls into the report server, which will have
access to the data at processing time. However, the custom processing
control will not be able to generate an arbitrary output-format-independent
rendering object (at least in the next version) - but it could generate a
custom image object with custom properties during processing, which are then
exposed during rendering to your custom rendering extension.
HTH,
Robert
--
This posting is provided "AS IS" with no warranties, and confers no rights.
|||This helps a lot!
Thank you, Robert.
Thursday, March 22, 2012
Access a database from a client application
Hi all...
I am writing a Windows application that will be used by about 100 clients (NT authenticated).
The application is going to be used in a LAN environment. Users who have access will download a VB.NET application from a web site.
Of course, I want to prevent any of the 100 users from using a tool such as Query Analyzer or ODBC to connect directly to the database and modify data.
So, it appears the way to go is "application roles". For this to work, the application role password has to be saved in the application in order to execute the sp_setapprole procedure.
Is there a way to secure this password? I am worried about a user somehow extracting the password from the application and connecting directly.
Am I on the right track here? Any ideas appreciated!
Thanks,
Forch
One method I have used is to create an admin system where you create a hash of the application name and version plus a secret string, and then use that hash as the password for the application role. Doing it this way also ensures that users can only use the current version.
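A minimal sketch of that scheme, assuming SHA-256; the application name, version, and secret below are invented, and the sp_setapprole call is shown only as a comment:

```python
import hashlib

def derive_approle_password(app_name: str, app_version: str, secret: str) -> str:
    """Derive an application-role password from the app identity plus a secret.

    Any change to the version string yields a different password, so an old
    client build can no longer activate the role.
    """
    material = f"{app_name}|{app_version}|{secret}".encode("utf-8")
    return hashlib.sha256(material).hexdigest()

# The client would then activate the role with something like:
#   EXEC sp_setapprole 'MyAppRole', @password = '<derived value>'
password = derive_approle_password("InventoryApp", "1.0.3", "s3cret-salt")
```

As noted in the next reply, this only obfuscates the password; a determined user who can debug the client can still recover it.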
|||Note that if the user has full control over your application and can debug it, he can break any protection scheme that you build into it. It's worth obfuscating the password, as Glenn suggested, but this protection can be broken by a knowledgeable user.
Thanks
Laurentiu
Sunday, March 11, 2012
Access to a file that is locked by another process
I am writing a package to process perfmon logs. The issue I have come across is that the perfmon process holds onto the log file, and SSIS fails because it wants exclusive read access. Is there any way of getting SSIS not to take an exclusive read on the file?
I can read the file fine in Notepad, but not in SSIS.
Check by changing the isolation level of the package; by default it is Serializable.
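If the lock persists, another workaround is to copy the in-use log to a temporary location and point the package's file connection at the copy. A minimal sketch, with invented paths:

```python
import shutil
import tempfile
from pathlib import Path

def snapshot_log(source: str) -> str:
    """Copy a (possibly in-use) log file to a temp location and return the copy's path.

    The package can then read the copy without contending with perfmon's
    open handle on the original.
    """
    src = Path(source)
    dest = Path(tempfile.gettempdir()) / src.name
    shutil.copyfile(src, dest)  # needs only shared read access on the source
    return str(dest)
```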
Thursday, March 8, 2012
about xp_cmdshell..
I'm using xp_cmdshell in a loop to write in a file...
do you know better way to improve effiently writing on disk in a loop?
or any recommandations on xp_cmdshell uses (could we switch off the
msg feedbacks ?)
thanks a lot
++
VinceDTS, BCP, COM or .NET all seem likely to be more efficient ways of
writing to a file. What type of file are you trying to write?
David Portas
SQL Server MVP
--|||On 31 Mar 2005 01:18:03 -0800, "David Portas"
<REMOVE_BEFORE_REPLYING_dportas@.acm.org> wrote:
>DTS, BCP, COM or .NET all seem likely to be more efficient ways of
>writing to a file. What type of file are you trying to write?
I'm trying to write CSV files, in a cursor loop which 'travels'
over 4 million records so far... I mean, I must write row after row
because some of the values are changed during the loop, so I
could not use BCP...
But what about DTS? Can we use it in a loop?
thanks
Vince|||I don't understand why you think you need to do this cursor-style. Why can't
you do a set-based UPDATE for whatever data changes you require and THEN use
DTS, BCP or whatever other method to do the export?
Yes, you can export data row-by-row using DTS, but that doesn't necessarily
mean that's the best way to do it. BCP is probably a better choice for
millions of rows of data.
Comma-separated files are a poor medium for data at the best of times. For 4
million rows it seems like there ought to be a better way. What is the
target system for the exported data? Can't it support other, more structured
file formats? If you must delimit, avoid commas and use TAB or pipe
characters instead.
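To illustrate the delimiter advice: most CSV libraries let you swap the comma for a tab or pipe, which sidesteps quoting when values themselves contain commas. A small sketch (file name and rows are invented):

```python
import csv

rows = [
    ("1001", "Smith, John", "P"),   # note the embedded comma
    ("1002", "O'Brien, Pat", "R"),
]

# Pipe-delimited output: embedded commas need no quoting at all.
with open("export.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="|")
    writer.writerow(("accno", "name", "status"))
    writer.writerows(rows)
```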
David Portas
SQL Server MVP
--|||Hello,
xp_cmdshell is not an efficient way of writing to a file. Instead you
can try an ActiveX DLL that can be instantiated by SQL Server
using an sp_OACreate call. This may be a better way of doing "writing to
an external file", especially when there is a large number of rows.
===================== SQL Snippet =====================
Exec @hr = sp_OACreate @tProgId, @iobToken Output
If @hr <> 0 Goto TrapOAErrors
Exec @hr = sp_OASetProperty @iobToken, N'FileName', @strFileName
If @hr <> 0 Goto TrapOAErrors
Exec @hr = sp_OAMethod @iobToken, N'OpenFileForWriting'
If @hr <> 0 Goto TrapOAErrors
--
--
Exec @hr = sp_OAMethod @iobToken, N'WriteBufferToFile', @bytesBuffer
If @hr <> 0 Goto TrapOAErrors
--
--
===================== SQL Snippet Ends =====================
-- VB ActiveX DLL Snippet --
'--==Function for opening the file for writing==
Public Function OpenFileForWriting()
    hOrgFile = CreateFile(m_sFileName, GENERIC_WRITE, FILE_SHARE_READ Or _
        FILE_SHARE_WRITE, ByVal 0&, OPEN_EXISTING, 0, 0)
    If hOrgFile <> 0 Then
        SetFilePointer hOrgFile, 0, 0, FILE_BEGIN
    End If
End Function
'--==Function for closing the file handle==
Public Function CloseFile()
    CloseHandle hOrgFile    '--==Close the file==
End Function
'--==Function for writing the buffer to the file==
Public Function WriteBufferToFile(tBuff() As Byte)
    Dim lRet As Long
    Err.Clear    '--==Clear the error device==
    If hOrgFile = 0 Then
        OpenFileForWriting
        If hOrgFile = 0 Then Exit Function
    End If
    WriteFile hOrgFile, tBuff(1), UBound(tBuff), lRet, ByVal 0&
    '--==Check for errors==
    If lRet <> UBound(tBuff) Then    ' Failed to write the whole buffer
        ErrorFlag = True
        ErrorNumber = Err.Number
        ErrorDesc = Err.Description
    End If
End Function
-- Snippet Ends --
The above snippet is not complete and may not work :-)
Rgds ...
--Bju.|||If you have the option of including this in a .NET project, I would
recommend a nice normal SELECT statement using a data reader: loop over
the reader, perform your row-by-row logic changes, and use my CSV
parser/writer to create the file. The writer is not free, but it is
relatively cheap. At worst, I'd still recommend basically the same
approach, using another way of creating the file in place of my
writer. The writer itself will create files at roughly 10 MB per
second.
http://www.geocities.com/shriop/index.html
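The reader-loop approach described above can be sketched in outline; here SQLite and the standard csv module stand in for SQL Server and the commercial writer (table, rows, and the per-row adjustment are invented):

```python
import csv
import sqlite3

# Stand-in data source: in the real setup this would be a data reader
# over the 4-million-row table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tran_reg (accno INTEGER, amount REAL)")
conn.executemany("INSERT INTO tran_reg VALUES (?, ?)",
                 [(1001, 10.0), (1002, 20.5), (1003, 30.25)])

with open("tran_reg.csv", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(("accno", "amount"))
    # Stream row by row so per-row value changes can happen mid-export.
    for accno, amount in conn.execute("SELECT accno, amount FROM tran_reg"):
        amount = round(amount * 1.1, 2)  # illustrative per-row adjustment
        writer.writerow((accno, amount))
```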
Saturday, February 25, 2012
About Sql Statement
I am writing a procedure in SQL Server 2000, and I am giving it three input parameters: account number, from date and to date. But the input to the procedure will be either the account number, or the from date and to date.
So how can I write the SELECT command so that I can pass either the account number or the from date and to date? The query I wrote is below, but it is giving an error:
Select * From Mf_Tran_Reg
Where mft_fundcd = 'RMF' and mft_purred = 'P'
if @Folio = ''
begin
    and mft_procdate between @Fdate and @tdate
end
else
begin
    and mft_accno = @Folio
end
Hi Majid,
CREATE PROCEDURE SomeProcedure
(
    @Folio INT = NULL,
    @Fdate DATETIME = NULL,
    @tdate DATETIME = NULL
)
AS
Select * From Mf_Tran_Reg
Where mft_fundcd = 'RMF' and mft_purred = 'P'
and
(
    mft_procdate between @Fdate and @tdate
    OR mft_accno = @Folio
)
As from your description both will not be passed - only either the dates or the account number. If the account number or the dates can be NULL in the table rows, you will have to add an AND column IS NULL condition so that you do not get back the NULLed values.
HTH, Jens K. Suessmeyer.
http://www.sqlserver2005.de
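The either/or filter in this answer can be exercised outside SQL Server as well; a small sketch with SQLite standing in (table name and values are invented). Unsupplied parameters stay NULL, which disables their branch of the OR:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mf_tran_reg (accno INTEGER, procdate TEXT)")
conn.executemany("INSERT INTO mf_tran_reg VALUES (?, ?)",
                 [(1, "2005-01-10"), (2, "2005-02-10"), (3, "2005-03-10")])

def find_trans(folio=None, fdate=None, tdate=None):
    """Pass either an account number or a date range; NULL parameters
    make their side of the OR evaluate to unknown, never true."""
    sql = """SELECT accno FROM mf_tran_reg
             WHERE (procdate BETWEEN :fdate AND :tdate)
                OR (accno = :folio)"""
    return [r[0] for r in conn.execute(
        sql, {"folio": folio, "fdate": fdate, "tdate": tdate})]
```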
Thursday, February 9, 2012
About Clickstream Datawarehouse
Hello:
I am a student from China. I am writing my graduation thesis, which is about a clickstream data warehouse for e-commerce with SQL Server 2005. But I am just a student, so I do not have a real web site on the Internet, and I cannot get data for one, so I cannot finish my thesis. Do you have a sample clickstream data warehouse or some web log data? I want to design a data warehouse and analyze it with SQL Server 2005. Just a sample is OK, or some code about this is OK. Thank you! You can contact me at chentao0405@yahoo.com.cn or MSN: forevertonychen@hotmail.com. Thank you!
Although I don't have a demo data warehouse to share, I know that there have been a number of people doing clickstream analysis on web server logs using SQL 2005 Integration Services to extract data into tables in a data warehouse and then using Analysis Services Data Mining to mine the results. There is a related article on clickstream analysis that shows a screenshot of the state transitions using the Microsoft Sequence Clustering mining model at the link below. You may want to cross-post your question in the SQL Data Mining forum for any follow up questions.
http://www.sqlmag.com/Articles/ArticleID/44821/pg/2/2.html
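As a sketch of the extract step such an ETL assumes - parsing web-server log lines into (visitor, timestamp, page) events and grouping them into per-visitor sessions - with an invented, simplified log format and a 30-minute session timeout:

```python
from datetime import datetime, timedelta

# Simplified log lines: ip, timestamp, requested page.
LOG_LINES = [
    '10.0.0.1 [01/Mar/2005:10:00:00] "GET /home HTTP/1.0"',
    '10.0.0.1 [01/Mar/2005:10:05:00] "GET /cart HTTP/1.0"',
    '10.0.0.2 [01/Mar/2005:10:06:00] "GET /home HTTP/1.0"',
    '10.0.0.1 [01/Mar/2005:11:00:00] "GET /home HTTP/1.0"',  # gap > 30 min
]

def sessionize(lines, timeout=timedelta(minutes=30)):
    """Group page requests into per-visitor sessions; a gap longer than
    `timeout` between two requests from the same IP starts a new session."""
    sessions = {}  # ip -> list of sessions, each a list of (time, page)
    for line in lines:
        ip, rest = line.split(" ", 1)
        ts = datetime.strptime(rest.split("]")[0].lstrip("["),
                               "%d/%b/%Y:%H:%M:%S")
        page = rest.split('"')[1].split()[1]
        visits = sessions.setdefault(ip, [])
        if not visits or ts - visits[-1][-1][0] > timeout:
            visits.append([])
        visits[-1].append((ts, page))
    return sessions
```

Rows like these would then be loaded into fact tables and fed to the Sequence Clustering model the article describes.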
|||Thank you for your help.