What is the right constant? [duplicate] - constants

I'm installing an application and want to set values for an ini file. Unfortunately, our main application is still built on a platform that gets redirected to the virtual store. Is there a way to get Inno Setup to store the ini file in the virtual store directly?

I believe there is no Windows API to retrieve the path to the Virtual Store, let alone a way to retrieve it reliably from Inno Setup.
But you can guess it to be {localappdata}\VirtualStore\path.
[Files]
Source: "MyProg.ini"; DestDir: "{code:GetVirtualStore|{app}}"

[Code]
function GetVirtualStore(Path: string): string;
var
  Drive: string;
begin
  Result := Path;
  Drive := ExtractFileDrive(Path);
  if CompareText(Drive, Copy(Path, 1, Length(Drive))) = 0 then
  begin
    Result := Copy(Result, Length(Drive) + 1, Length(Result) - Length(Drive));
    Result := ExpandConstant('{localappdata}\VirtualStore') + Result;
  end;
end;
You should probably also check that the path is on a system drive.
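For instance, a minimal sketch of such a check might compare the drive of the path against the drive hosting the Windows directory (the helper name IsOnSystemDrive is my own, not part of the original answer):
function IsOnSystemDrive(const Path: string): Boolean;
begin
  { Assumption: the system drive is the one hosting the Windows directory }
  Result := CompareText(
    ExtractFileDrive(Path),
    ExtractFileDrive(ExpandConstant('{win}'))) = 0;
end;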

Related

Can I insert an EDUPUB (Zip-like) file into a MarkLogic database through a REST API extension module?

The user will upload an EDUPUB/Zip file from the UI. We want to implement a REST API extension module that takes the EDUPUB/Zip file and ingests it into a MarkLogic database. Does the MarkLogic API support this? Any suggestions?
I implemented the code below for extracting and uploading an EDUPUB/Zip file:
xquery version "1.0-ml";

declare namespace zip = "xdmp:zip";

declare function local:epubupload($filepath as xs:string)
{
  let $get_document := xdmp:document-get($filepath)
  let $get_uri := fn:document-uri($get_document)
  let $get_document_uri := fn:concat($get_uri, "/")
  let $get_collection := fn:tokenize($get_uri, "\\")[last()]
  let $epub_extract := xdmp:zip-manifest($get_document)
  for $each_file in $epub_extract/zip:part/text()
  let $document_data := xdmp:zip-get($get_document, $each_file)
  let $full_document_uri := fn:concat($get_document_uri, $each_file)
  return xdmp:document-insert($full_document_uri, $document_data, (), $get_collection)
};

local:epubupload("c:\data\sample.epub")
But for the REST API, what is the parameter? And how do I get the whole file from the user's system?
If you are creating your own REST extension, then you can use the following pattern on the zip payload:
1. Iterate over the zip file using xdmp:zip-manifest
2. For each entry, use xdmp:zip-get to extract the file
3. Save it into MarkLogic via xdmp:document-insert
Depending on how you posted the content, xdmp:base64-decode may be part of your code to actually get to your zip file.
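A minimal sketch of such an extension might look like the following (the module name epubupload, the /epub/ URI prefix, and the assumption that the zip arrives as a binary PUT payload are all mine, not from the original answer):
xquery version "1.0-ml";

(: Hypothetical REST resource extension; install it via the
   /v1/config/resources/epubupload endpoint and call it with
   PUT /v1/resources/epubupload, sending the zip as the body. :)
module namespace ext = "http://marklogic.com/rest-api/resource/epubupload";

declare namespace zip = "xdmp:zip";

declare function ext:put(
  $context as map:map,
  $params as map:map,
  $input as document-node()*
) as document-node()?
{
  let $zipfile := $input[1]  (: assumes the body arrived as a binary document :)
  let $manifest := xdmp:zip-manifest($zipfile)
  let $_ :=
    for $part in $manifest/zip:part/text()
    return xdmp:document-insert(
      fn:concat("/epub/", $part),  (: assumed URI prefix :)
      xdmp:zip-get($zipfile, $part))
  return document { <uploaded>{fn:count($manifest/zip:part)}</uploaded> }
};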

HTML5 File API in Firefox Addon SDK

Is there a way to access the HTML5 File API in the Firefox Addon SDK, from a content script?
I need this to store user-added words and their meanings. The data can grow large, so local storage isn't an option.
window.requestFileSystem3 = window.requestFileSystem || window.webkitRequestFileSystem;
gives me the error TypeError: window.requestFileSystem3 is not a function.
I am asking because I am porting this code from a Google Chrome extension, which allows accessing the File API in a content script.
Additional Questions
1) If the HTML5 File API is not allowed, should I use the file module instead?
2) Does the file module allow access to any file on the file system, as opposed to the HTML5 File API, which only provides sandboxed access to the file system?
3) Assuming I have to use the file module, what would be the best location to store my files (like the user profile directory or the extension directory), and how would I get this path in code?
I apologize for so many sub-questions inside this question. Google wasn't very helpful on this topic.
Any sample code would be very helpful.
Firefox doesn't support writing files via the File API yet, and even when this is added it will probably be accessible to web pages only, not extensions. In other words: yes, if you absolutely need to write files then you should use the low-level APIs. You want to store your data in the user profile directory (there is no extension directory; your extension is usually installed as a single packed file). Something like this should work to write a file:
var file = require("sdk/io/file");
var profilePath = require("sdk/system").pathFor("ProfD");
var filePath = file.join(profilePath, "foo.txt");
var writer = file.open(filePath, "w");
writer.writeAsync("foo!", function(error)
{
  if (error)
    console.log("Error: " + error);
  else
    console.log("Success!");
});
For reference: sdk/io/file, sdk/system
You could use TextReader.read() or file.read() to read the file. Unfortunately, the Add-on SDK doesn't seem to support asynchronous file reading, so the read will block the Firefox UI. The only alternative would be importing NetUtil and FileUtils via chrome authority, something like this:
var {components, Cu} = require("chrome");
var {NetUtil} = Cu.import("resource://gre/modules/NetUtil.jsm", null);
var {FileUtils} = Cu.import("resource://gre/modules/FileUtils.jsm", null);
NetUtil.asyncFetch(new FileUtils.File(filePath), function(stream, result)
{
  if (components.isSuccessCode(result))
  {
    var data = NetUtil.readInputStreamToString(stream, stream.available());
    console.log("Success: " + data);
  }
  else
    console.log("Error: " + result);
});
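For completeness, the blocking variant mentioned above is only a couple of lines (a sketch; as noted, it will freeze the UI while reading):
var file = require("sdk/io/file");
// open in text mode for reading; read() returns the whole file as a string
var reader = file.open(filePath, "r");
var data = reader.read();
reader.close();
console.log("Read: " + data);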

How to implement the equivalent of "NSUserDefaults" functionality in Corona?

All I want is to save my user's (player's) high scores, and for this information to persist between application (game) launches in the Corona SDK (Lua). I want it to work nicely on both iOS and Android. My high-score data is actually two Lua tables containing numbers.
What's the correct and easiest way to do it?
You can keep the scores in a table and then serialize it to a JSON text file.
local json = require("json")
local savefile = "scores.json"

scores =
{
  {
    level = 1,
    status = 0,
    highscore = 0,
  },
  {
    level = 2,
    status = 0,
    highscore = 0,
  },
}

function getScore(filename, base)
  -- set default base dir if none specified
  if not base then
    base = system.DocumentsDirectory
  end
  -- create a file path for corona i/o
  local path = system.pathForFile(filename, base)
  -- will hold contents of file
  local contents
  -- io.open opens a file at path. returns nil if no file found
  local file = io.open(path, "r")
  local scores
  if file then
    -- read all contents of file into a string
    contents = file:read("*a")
    if contents ~= nil then
      scores = json.decode(contents)
    end
    io.close(file) -- close the file after using it
  end
  return scores
end

function saveScore(filename, base)
  -- set default base dir if none specified
  if not base then
    base = system.DocumentsDirectory
  end
  -- create a file path for corona i/o
  local path = system.pathForFile(filename, base)
  -- io.open opens the file at path for writing, creating it if needed
  local file = io.open(path, "wb")
  if file then
    -- encode the scores table as JSON and write it to the file
    file:write(json.encode(scores))
    io.close(file) -- close the file after using it
  end
end
The global scores variable can be manipulated like a normal table, and when you want to load or save the scores table you can call the functions above.
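A typical call sequence might look like this (a sketch; getScore returns nil on the first launch, before any file has been saved):
-- load previously saved scores, if any
local saved = getScore(savefile)
if saved then
  scores = saved
end

-- update a high score and persist the whole table
scores[1].highscore = 1200
saveScore(savefile)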

InnoSetup: Is it possible to open my custom Delphi form (from the DLL) instead of the standard setup wizard

I need to create a complex form with my own components (a kind of one-click installer) and use it as a replacement for the standard InnoSetup wizard. Is this possible?
My form is placed in a DLL, and this DLL will be available to the InnoSetup process.
This is how I tried to do that:
Delphi DLL code
library OneClickWizard;

uses
  SysUtils,
  Classes,
  Wizard in 'Wizard.pas' {FormWizard};

{$R *.res}

exports
  CreateWizardForm,
  DestroyWizardForm;

begin
end.
Delphi form
unit Wizard;

interface

uses
  Windows, Classes, Forms;

type
  TFormWizard = class(TForm)
  private
    { Private declarations }
  public
    { Public declarations }
  end;

var
  FormWizard: TFormWizard;

procedure CreateWizardForm(AppHandle: THandle); stdcall;
procedure DestroyWizardForm; stdcall;

implementation

{$R *.dfm}

procedure CreateWizardForm(AppHandle: THandle);
begin
  Application.Handle := AppHandle;
  FormWizard := TFormWizard.Create(Application);
  FormWizard.Show;
  FormWizard.Refresh;
end;

procedure DestroyWizardForm;
begin
  FormWizard.Free;
end;

end.
InnoSetup script (iss)
[Setup]
; Disable all of the default wizard pages
DisableDirPage=yes
DisableProgramGroupPage=yes
DisableReadyMemo=true
DisableReadyPage=true
DisableStartupPrompt=true
DisableWelcomePage=true
DisableFinishedPage=true

[Files]
Source: "OneClickWizard.dll"; Flags: dontcopy

[Code]
procedure CreateWizardForm(AppHandle: Cardinal);
  external 'CreateWizardForm@files:OneClickWizard.dll stdcall';

procedure DestroyWizardForm;
  external 'DestroyWizardForm@files:OneClickWizard.dll stdcall';

procedure InitializeWizard();
begin
  CreateWizardForm(MainForm.Handle);
end;
The form appears on the screen, but it doesn't react to my input. It seems to be outside the main message loop. How do I do this correctly?
In my setup I do something similar.
In the InnoSetup code I pass the handle as StrToInt(ExpandConstant('{wizardhwnd}')) (my guess is that MainForm.Handle is zero).
in the DLL:
OldAppHandle := Application.Handle;
try
  Application.Handle := hAppHandle; // hAppHandle is the handle from InnoSetup
  F := TfmZForm.Create(Application);
  try
    F.Caption := lpTitle;
    F.ShowModal;
    Result := F.ErrorCode;
  finally
    F.Free;
  end;
finally
  Application.Handle := OldAppHandle;
end;
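On the InnoSetup side, the call might then look like this (a sketch; the exported name ShowWizardForm and its parameters are illustrative, matching the modal entry point above rather than the asker's original exports):
[Code]
function ShowWizardForm(AppHandle: Cardinal; lpTitle: string): Integer;
  external 'ShowWizardForm@files:OneClickWizard.dll stdcall';

procedure InitializeWizard();
begin
  { Pass the wizard window handle; MainForm.Handle may be zero at this point }
  ShowWizardForm(StrToInt(ExpandConstant('{wizardhwnd}')), 'My Installer');
end;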
I know precisely nothing about InnoSetup, but surely you need to use ShowModal rather than Show here. Installation UI is invariably modal, and what you want is to wait until the user has finished interacting with the form before you return to Inno. Otherwise, how would Inno know when to proceed? ShowModal runs a message loop to service the form, so there will be no problems receiving input.
You would also change your DLL to remove DestroyWizardForm, since the function that calls ShowModal can both create and destroy the form.
If you want to entirely replace the UI, it will probably be easier to create a stub application that presents the form and then runs the normal setup in silent mode, passing various command line parameters.
Either that, or at the very least use Inno's native form and wizard page functions/logic.

Reporting Services Deployment

I need to create a repeatable process for deploying SQL Server Reporting Services reports. I am not in favor of using Visual Studio or Business Intelligence Development Studio to do this. The rs.exe method of scripting deployments also seems rather clunky. Does anyone have an elegant way to deploy reports? The key here is that I want the process to be completely automated.
We use rs.exe; once we developed the script we have not needed to touch it again, it just works.
Here is the source (I slightly modified it by hand to remove sensitive data, without a chance to test it; I hope I did not break anything). It deploys reports and associated images from subdirectories for the various languages, and also creates the data source.
'=====================================================================
' File: PublishReports.rss
'
' Summary: Script that can be used with RS.exe to
' publish the reports.
'
' The rss file spans from the beginning of this comment to the end
' of the module (except for "End Module").
'=====================================================================
Dim langPaths As String() = {"en", "cs", "pl", "de"}
Dim filePath As String = Environment.CurrentDirectory

Public Sub Main()
    rs.Credentials = System.Net.CredentialCache.DefaultCredentials

    'Create parent folder
    Try
        rs.CreateFolder(parentFolder, "/", Nothing)
        Console.WriteLine("Parent folder created: {0}", parentFolder)
    Catch e As Exception
        Console.WriteLine(e.Message)
    End Try

    PublishLanguagesFromFolder(filePath)
End Sub
Public Sub PublishLanguagesFromFolder(ByVal folder As String)
    Dim Lang As Integer
    Dim langPath As String

    For Lang = langPaths.GetLowerBound(0) To langPaths.GetUpperBound(0)
        langPath = langPaths(Lang)

        'Create the lang folder
        Try
            rs.CreateFolder(langPath, "/" + parentFolder, Nothing)
            Console.WriteLine("Parent lang folder created: {0}", parentFolder + "/" + langPath)
        Catch e As Exception
            Console.WriteLine(e.Message)
        End Try

        'Create the shared data source
        CreateDataSource("/" + parentFolder + "/" + langPath)

        'Publish reports and images
        PublishFolderContents(folder + "\" + langPath, "/" + parentFolder + "/" + langPath)
    Next 'Lang
End Sub
Public Sub CreateDataSource(ByVal targetFolder As String)
    Dim name As String = "data source"

    'Data source definition.
    Dim definition As New DataSourceDefinition
    definition.CredentialRetrieval = CredentialRetrievalEnum.Store
    definition.ConnectString = "data source=" + dbServer + ";initial catalog=" + db
    definition.Enabled = True
    definition.EnabledSpecified = True
    definition.Extension = "SQL"
    definition.ImpersonateUser = False
    definition.ImpersonateUserSpecified = True
    'Use the default prompt string.
    definition.Prompt = Nothing
    definition.WindowsCredentials = False
    'Login information
    definition.UserName = "user"
    definition.Password = "password"

    Try
        'name, folder, overwrite, definition, properties
        rs.CreateDataSource(name, targetFolder, True, definition, Nothing)
    Catch e As Exception
        Console.WriteLine(e.Message)
    End Try
End Sub
Public Sub PublishFolderContents(ByVal sourceFolder As String, ByVal targetFolder As String)
    Dim di As New DirectoryInfo(sourceFolder)
    Dim fis As FileInfo() = di.GetFiles()
    Dim fi As FileInfo
    Dim fileName As String

    For Each fi In fis
        fileName = fi.Name
        Select Case fileName.Substring(fileName.Length - 4).ToUpper
            Case ".RDL"
                PublishReport(sourceFolder, fileName, targetFolder)
            Case ".JPG", "JPEG" 'the last four characters of ".jpeg" carry no dot
                PublishResource(sourceFolder, fileName, "image/jpeg", targetFolder)
            Case ".GIF", ".PNG", ".BMP"
                PublishResource(sourceFolder, fileName, "image/" + fileName.Substring(fileName.Length - 3).ToLower, targetFolder)
        End Select
    Next fi
End Sub
Public Sub PublishReport(ByVal sourceFolder As String, ByVal reportName As String, ByVal targetFolder As String)
    Dim definition As [Byte]() = Nothing
    Dim warnings As Warning() = Nothing

    Try
        Dim stream As FileStream = File.OpenRead(sourceFolder + "\" + reportName)
        definition = New [Byte](stream.Length) {}
        stream.Read(definition, 0, CInt(stream.Length))
        stream.Close()
    Catch e As IOException
        Console.WriteLine(e.Message)
    End Try

    Try
        'name, folder, overwrite, definition, properties
        warnings = rs.CreateReport(reportName.Substring(0, reportName.Length - 4), targetFolder, True, definition, Nothing)

        If Not (warnings Is Nothing) Then
            Dim warning As Warning
            For Each warning In warnings
                Console.WriteLine(warning.Message)
            Next warning
        Else
            Console.WriteLine("Report: {0} published successfully with no warnings", targetFolder + "/" + reportName)
        End If
    Catch e As Exception
        Console.WriteLine(e.Message)
    End Try
End Sub
Public Sub PublishResource(ByVal sourceFolder As String, ByVal resourceName As String, ByVal resourceMIME As String, ByVal targetFolder As String)
    Dim definition As [Byte]() = Nothing
    Dim warnings As Warning() = Nothing

    Try
        Dim stream As FileStream = File.OpenRead(sourceFolder + "\" + resourceName)
        definition = New [Byte](stream.Length) {}
        stream.Read(definition, 0, CInt(stream.Length))
        stream.Close()
    Catch e As IOException
        Console.WriteLine(e.Message)
    End Try

    Try
        'name, folder, overwrite, definition, MIME, properties
        rs.CreateResource(resourceName, targetFolder, True, definition, resourceMIME, Nothing)
        Console.WriteLine("Resource: {0} with MIME {1} created successfully", targetFolder + "/" + resourceName, resourceMIME)
    Catch e As Exception
        Console.WriteLine(e.Message)
    End Try
End Sub
Here is the batch to call the rs.exe:
SET ReportServer=%1
SET DBServer=%2
SET DBName=%3
SET ReportFolder=%4
rs -i PublishReports.rss -s %ReportServer% -v dbServer="%DBServer%" -v db="%DBName%" -v parentFolder="%ReportFolder%" >PublishReports.log 2>&1
pause
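An invocation might then look like this (assuming the batch above is saved as PublishReports.bat; the server, database, and folder names are placeholders):
REM report server URL, DB server, DB name, target report folder
PublishReports.bat http://myserver/ReportServer MyDbServer MyDatabase MyReports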
I used the script @David supplied, but I had to add some code (I'm typing this up as an answer, as it would be too long for a comment).
The problem is that if a "shared data source" is already attached to a report in the report definition, it is never the same data source as the one created in the script.
This also becomes apparent from the warning emitted by the "CreateReport" method:
The data set '' refers to the shared data source '', which is not published on the report server.
So the data source has to be set explicitly afterwards. I've made the following code changes:
I added a global variable:
Dim dataSourceRefs(0) As DataSource
At the end of the CreateDataSource method, that variable gets filled:
Dim dsr As New DataSourceReference
dsr.Reference = "/" + parentFolder + "/" + db
Dim ds As New DataSource
ds.Item = CType(dsr, DataSourceDefinitionOrReference)
ds.Name = db
dataSourceRefs(0) = ds
And in the PublishReport method, that data source gets explicitly set (after CreateReport has been called):
rs.SetItemDataSources(targetFolder + "/" + reportName.Substring(0, reportName.Length - 4), dataSourceRefs)
Note that this last call is only available in RS 2005 or higher. If you want to load your reports onto an RS 2000 server, you have to use SetReportDataSources instead:
rs.SetReportDataSources(targetFolder + "/" + reportName.Substring(0, reportName.Length - 4), dataSourceRefs)
Well, not really elegant: we created our own tool that uses the ReportingService2005 web service. We found this to be the most reliable way of getting what we want.
It's not really that difficult and lets you expand it to do other things like creating data sources and folders as required.
I strongly recommend RSScripter. As noted in the overview:
Reporting Services Scripter is a .NET Windows Forms application that enables scripting and transfer of all Microsoft SQL Server Reporting Services catalog items to aid in transferring them from one server to another. It can also be used to easily move items en masse from one Reporting Services folder to another on the same server. Depending on the scripting options chosen, Reporting Services Scripter can also transfer all catalog item properties such as Descriptions, History options, Execution options (including report specific and shared schedules), Subscriptions (normal and data driven) and server side report parameters.
I know you say you're not in favor of using the Business Intelligence Development Studio for this, but I've found the built-in tools to be very reliable and easy to use.
Have you looked into any Continuous Integration solutions such as CruiseControl.NET? If you are able to deploy reports using rs.exe, then you can set up an automated process in CruiseControl to build and deploy your reports on a timer, or whenever a report is modified.
In our environment, we develop in VS with version control, then deploy to the DEV SSRS. Once a report is validated, we use the ReportSync program to deploy reports from ReportServer DEV to ReportServer PROD. The RS.EXE scripts still have their place, but I have found ReportSync to be a much simpler and more agile way to promote a report.
ReportSync:
ReportSync is an open source program free to download and use. It works great for downloading reports in bulk, and it can even push a report from one server to another server.
How do I download the program?
Download the source code files from GitHub (Phires/ReportSynch), open the solution file (.SLN) in VS, compile the program, and find the executable file (.EXE) in the C:\Temp\reportsync-master\bin\Release folder. Finally, save the .EXE somewhere for regular use.
How do I copy SSRS reports to a new server if I am not the owner of the reports --> ReportSync answer by nunespascal
How to deploy a report?
Run the executable and the interface will launch.
Use the SOURCE and DESTINATION dialogues to choose a single report, multiple reports, or an entire folder of reports. You can pick any target folder you would like. (HINT: You can even target the same server if you want to duplicate a report on the same server.)
After making your selections, press the Sync button.
Go to the target server and validate that the change took effect by reviewing the Changed By Date.
This tool has been very convenient, but I have noticed some quirks. For example, when I want to update just one report that already exists in the destination, here is what I have to select: [Source: Report > Target: Folder > Sync]. WARNING: You might think you would select the target server report to update it, but I have tried this and the report does not get updated.
What else can ReportSync do?
There is also an Export feature, which works marvelously for simply dumping all the RDL files to a folder for me to access. This is helpful in the event you need to migrate the server, add the files to a VS solution project, or do anything else with all the files.
In my testing this program does not migrate other content (subscriptions, shared data sources, shared data sets); it is just applicable to the report files.
I know this post is old, but I came across it when researching RS.EXE scripts, so I thought I would provide an answer to this question.