Saving to a .yml file using Spigot plugins

I'm attempting to produce a Message.yml file using Spigot's YamlConfiguration.
This is my code:
public static void create() {
    if (messagesFile.exists()) return;
    try {
        messagesFile.createNewFile();
        messages.options().copyDefaults(true);
        messages.addDefault("MESSAGES.PREFIX", "&c[YourServer] ");
        messages.addDefault("MESSAGES.DESIGN", "§8§l- ");
        messages.addDefault("MESSAGES.NOPERMS", "§c§lDazu hast du keine Rechte!");
        messages.addDefault("MESSAGES.ADDMAP.USAGE", "§c§lBitte nutze /addmap [mapname]!");
        messages.save(messagesFile);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
However, the config.yml file I received after running it read as follows:
MESSAGES:
  PREFIX: '&c[YourServer] '
  DESIGN: "\xa78\xa7l- "
  NOPERMS: "\xa7c\xa7lDazu hast du keine Rechte!"
  ADDMAP:
    USAGE: "\xa7c\xa7lBitte nutze /addmap [mapname]!"
Is there any way to fix it?

The YAML writer treats the § as part of a plain string rather than as a standalone color character.
https://www.spigotmc.org/threads/special-characters-in-config.298138/

Yeah, you're using a special character to store the color, but the value is just a String. Don't put the color code in the config; just save the plain text. When you want to send the text from the config, do something like:
player.sendMessage(ChatColor.RED + config.getString("MESSAGES.PREFIX"));
This is just an example.

Like #Minecraft said in his answer, the issue is that the § is being treated as part of your string and written out as a Unicode escape sequence when the file is saved.
What I would do is store your custom config file in your plugin's resources directory with all the default values you want it to have already defined.
Then, when you want to use the custom message, get it from the config file via the methods on the value returned by getConfig(). If you want to support color codes, use message = ChatColor.translateAlternateColorCodes('&', yourMessage); or something along those lines. That should be plenty to get you going.
Also, be sure to use a single, consistent symbol for these color codes (the default is &); you can set it via the aforementioned translateAlternateColorCodes() method. You seem to be using both & and §; I would stick to &.
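For example, here is a minimal sketch of that pattern; it assumes a FileConfiguration named messages and a hypothetical helper class, so adapt the names to your plugin:
import org.bukkit.ChatColor;
import org.bukkit.configuration.file.FileConfiguration;
import org.bukkit.entity.Player;

public final class MessageUtil {
    // Reads a message stored with '&' color codes and sends it to the player with the colors applied.
    public static void sendConfigMessage(Player player, FileConfiguration messages, String path) {
        String raw = messages.getString(path, "");                          // e.g. "&c&lDazu hast du keine Rechte!"
        String colored = ChatColor.translateAlternateColorCodes('&', raw);  // '&' codes become § codes
        player.sendMessage(colored);
    }
}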
Sources:
https://www.spigotmc.org/wiki/config-files/#using-custom-configurations
https://hub.spigotmc.org/javadocs/bukkit/org/bukkit/ChatColor.html#translateAlternateColorCodes(char,java.lang.String)

Related

WordprocessingDocument.CreateFromTemplate method creates corrupted MS Word files

I have a proper .dotm template.
When I create a new file based on the template by double-clicking it in Explorer, the correct file is created (based on this template). The created file's size after saving is 16 KB (without any content).
But when I use the .CreateFromTemplate method in my code, I cannot open the newly created .docx file in MS Word.
The new file's size is 207 KB (just like the .dotm file). MS Word displays "run-time error 5398" and does not open the file.
I'm using nuget package DocumentFormat.OpenXml 2.19.0, Word 365 version 16.0.14931.20648 - 32bit and code like this:
using (WordprocessingDocument doc = WordprocessingDocument.CreateFromTemplate(templatePath))
{
    doc.SaveAs(newFileName);
}
Google is silent about this error; ChatGPT says that:
The "Run-time Error 5398" error means that the file you are trying to open is corrupted or not a valid docx file. Possible reasons for this error may be the following:
The file was not saved correctly after making changes. Verify that the Save() method was called after making changes to the file.
The file was saved with the wrong extension, e.g. as DOTM instead of DOCX
The file was saved in an invalid format.
There may have been some unhandled exceptions in your code.
When I manually change the new file's extension from .docx to .dotm, there is no error, but the file still does not open.
What am I doing wrong with CreateFromTemplate method?
I tried to reproduce the behavior you described, using the following unit tests:
public sealed class CreateFromTemplateTests
{
    private readonly ITestOutputHelper _output;

    public CreateFromTemplateTests(ITestOutputHelper output)
    {
        _output = output;
    }

    [Theory]
    [InlineData("c:\\temp\\MacroEnabledTemplate.dotm", "c:\\temp\\MacroEnabledDocument.docm")]
    [InlineData("c:\\temp\\Template.dotx", "c:\\temp\\Document.docx")]
    public void CanCreateDocmFromDotm(string templatePath, string documentPath)
    {
        // Let's not attach the template, which is done by default. If a template is attached, the validator complains as follows:
        // The element has unexpected child element 'http://schemas.openxmlformats.org/wordprocessingml/2006/main:attachedTemplate'.
        using (var wordDocument = WordprocessingDocument.CreateFromTemplate(templatePath, false))
        {
            // Validate the document as created with CreateFromTemplate.
            ValidateOpenXmlPackage(wordDocument);

            // Save that document to disk so we can open it with Word, for example.
            wordDocument.SaveAs(documentPath).Dispose();
        }

        using (WordprocessingDocument wordDocument = WordprocessingDocument.Open(documentPath, true))
        {
            // Validate the document that was opened from disk, just to see what Word would open.
            ValidateOpenXmlPackage(wordDocument);
        }
    }

    private void ValidateOpenXmlPackage(OpenXmlPackage openXmlPackage)
    {
        OpenXmlValidator validator = new(FileFormatVersions.Office2019);
        List<ValidationErrorInfo> validationErrors = validator.Validate(openXmlPackage).ToList();
        foreach (ValidationErrorInfo validationError in validationErrors)
        {
            _output.WriteLine(validationError.Description);
        }

        if (validationErrors.Any())
        {
            // Note that Word will most often be able to open the document even if there are validation errors.
            throw new Exception("The validator found validation errors.");
        }
    }
}
In both tests, the documents are created without an issue. Looking at the Open XML markup, both documents look fine. However, while I don't get any runtime error, Word also does not open the macro-enabled document.
I am not sure why that happens. It might be related to your security settings.
Depending on whether or not you really need to use CreateFromTemplate(), you could create a .docm (rather than a .dotm) and create new macro-enabled documents by copying that .docm.
I opened an issue in the Open XML SDK project on GitHub.

Changing working directory in Scala [duplicate]

How can I change the current working directory from within a Java program? Everything I've been able to find about the issue claims that you simply can't do it, but I can't believe that that's really the case.
I have a piece of code that opens a file using a hard-coded relative file path from the directory it's normally started in, and I just want to be able to use that code from within a different Java program without having to start it from within a particular directory. It seems like you should just be able to call System.setProperty( "user.dir", "/path/to/dir" ), but as far as I can figure out, calling that line just silently fails and does nothing.
I would understand if Java didn't allow you to do this, if it weren't for the fact that it allows you to get the current working directory, and even allows you to open files using relative file paths....
There is no reliable way to do this in pure Java. Setting the user.dir property via System.setProperty() or java -Duser.dir=... does seem to affect subsequent creations of Files, but not e.g. FileOutputStreams.
The File(String parent, String child) constructor can help if you build up your directory path separately from your file path, allowing easier swapping.
An alternative is to set up a script to run Java from a different directory, or use JNI native code as suggested below.
The relevant OpenJDK bug was closed in 2008 as "will not fix".
If you run your legacy program with ProcessBuilder, you will be able to specify its working directory.
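Here is a minimal sketch of that approach; the command and the directory are placeholders:
import java.io.File;
import java.io.IOException;

public class LaunchInDirectory {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Hypothetical command; replace it with whatever starts your legacy program.
        ProcessBuilder pb = new ProcessBuilder("java", "-jar", "legacy.jar");
        pb.directory(new File("/path/to/dir")); // working directory of the child process
        pb.inheritIO();                         // forward stdin/stdout/stderr to this JVM
        Process process = pb.start();
        process.waitFor();
    }
}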
There is a way to do this using the system property "user.dir". The key part to understand is that getAbsoluteFile() must be called (as shown below) or else relative paths will be resolved against the default "user.dir" value.
import java.io.*;

public class FileUtils
{
    public static boolean setCurrentDirectory(String directory_name)
    {
        boolean result = false;  // Boolean indicating whether directory was set
        File directory;          // Desired current working directory

        directory = new File(directory_name).getAbsoluteFile();
        if (directory.exists() || directory.mkdirs())
        {
            result = (System.setProperty("user.dir", directory.getAbsolutePath()) != null);
        }

        return result;
    }

    public static PrintWriter openOutputFile(String file_name)
    {
        PrintWriter output = null;  // File to open for writing

        try
        {
            output = new PrintWriter(new File(file_name).getAbsoluteFile());
        }
        catch (Exception exception) {}

        return output;
    }

    public static void main(String[] args) throws Exception
    {
        FileUtils.openOutputFile("DefaultDirectoryFile.txt");
        FileUtils.setCurrentDirectory("NewCurrentDirectory");
        FileUtils.openOutputFile("CurrentDirectoryFile.txt");
    }
}
It is possible to change the PWD by using JNA/JNI to make calls into libc. The JRuby folks have a handy Java library for making POSIX calls called jnr-posix. Here's the Maven info
As mentioned, you can't change the CWD of the JVM, but if you launch another process using Runtime.exec(), you can use the overloaded method that lets you specify the working directory. This is not really for running your Java program in another directory, but in many cases where you need to launch another program, such as a Perl script, you can specify that script's working directory while leaving the JVM's working directory unchanged.
See Runtime.exec javadocs
Specifically,
public Process exec(String[] cmdarray, String[] envp, File dir) throws IOException
where dir is the working directory to run the subprocess in
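A minimal sketch of that overload in use; the script name and directory below are placeholders:
import java.io.File;
import java.io.IOException;

public class RunInOtherDirectory {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Run a hypothetical Perl script with /some/other/dir as its working directory;
        // the JVM's own working directory is left unchanged.
        Process process = Runtime.getRuntime().exec(
                new String[] { "perl", "script.pl" }, // command
                null,                                 // null = inherit the parent's environment
                new File("/some/other/dir"));         // working directory of the subprocess
        process.waitFor();
    }
}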
If I understand correctly, a Java program starts with a copy of the current environment variables. Any changes via System.setProperty(String, String) are modifying the copy, not the original environment variables. Not that this provides a thorough reason as to why Sun chose this behavior, but perhaps it sheds a little light...
The working directory is an operating system feature (set when the process starts).
Why don't you just pass your own System property (-Dsomeprop=/my/path) and use that in your code as the parent of your File:
File f = new File(System.getProperty("someprop"), myFilename);
The smarter/easier thing to do here is to change your code so that, instead of opening the file assuming it exists in the current working directory (I assume you are doing something like new File("blah.txt")), you build the path to the file yourself, as in the sketch below.
Let the user pass in the base directory, read it from a config file, fall back to user.dir if the other properties can't be found, and so on. It's a whole lot easier to improve the logic in your program than it is to change how environment variables work.
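A minimal sketch of that idea; app.base.dir is a made-up property name used only for illustration:
import java.io.File;

public class ResolveFromBase {
    // Resolve a file name against a configurable base directory instead of the current working directory.
    static File resolve(String fileName) {
        String base = System.getProperty("app.base.dir", System.getProperty("user.dir"));
        return new File(base, fileName);
    }

    public static void main(String[] args) {
        // Run with -Dapp.base.dir=/path/to/dir to point the program somewhere else.
        System.out.println(resolve("blah.txt").getAbsolutePath());
    }
}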
I have tried to invoke
String oldDir = System.setProperty("user.dir", currdir.getAbsolutePath());
It seems to work. But
File myFile = new File("localpath.ext");
InputStream openit = new FileInputStream(myFile);
throws a FileNotFoundException though
myFile.getAbsolutePath()
shows the correct path.
I have read this. I think the problem is:
Java knows the current directory with the new setting.
But the file handling is done by the operating system, which unfortunately does not know about the newly set current directory.
The solution may be:
File myFile = new File(System.getProperty("user.dir"), "localpath.ext");
This creates an absolute File object using the current directory known to the JVM. However, that code has to live in the class being used, so reused code would need to be changed.
You can use
new File("relative/path").getAbsoluteFile()
after
System.setProperty("user.dir", "/some/directory")
System.setProperty("user.dir", "C:/OtherProject");
File file = new File("data/data.csv").getAbsoluteFile();
System.out.println(file.getPath());
Will print
C:\OtherProject\data\data.csv
You can change the process's actual working directory using JNI or JNA.
With JNI, you can use native functions to set the directory. The POSIX method is chdir(). On Windows, you can use SetCurrentDirectory().
With JNA, you can wrap the native functions in Java binders.
For Windows:
private static interface MyKernel32 extends Library {
    public MyKernel32 INSTANCE = (MyKernel32) Native.loadLibrary("Kernel32", MyKernel32.class);

    /** BOOL SetCurrentDirectory( LPCTSTR lpPathName ); */
    int SetCurrentDirectoryW(char[] pathName);
}
For POSIX systems:
private interface MyCLibrary extends Library {
    MyCLibrary INSTANCE = (MyCLibrary) Native.loadLibrary("c", MyCLibrary.class);

    /** int chdir(const char *path); */
    int chdir(String path);
}
The other possible answer to this question may depend on the reason you are opening the file. Is this a property file or a file that has some configuration related to your application?
If this is the case, consider loading the file through the class loader; that way you can load any file that is on the classpath.
If you run your commands in a shell, you can write something like java -cp and add any directories you want, separated by ":". If Java doesn't find something in one directory, it tries the other directories; that is what I do.
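A minimal sketch of loading a file from the classpath; config.properties and some.key are placeholder names:
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class ClasspathConfig {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        // Looks up config.properties on the classpath, regardless of the process's working directory.
        try (InputStream in = ClasspathConfig.class.getClassLoader()
                .getResourceAsStream("config.properties")) {
            if (in == null) {
                throw new IOException("config.properties not found on the classpath");
            }
            props.load(in);
        }
        System.out.println(props.getProperty("some.key"));
    }
}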
Use FileSystemView
import java.io.File;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import javax.swing.filechooser.FileSystemView;

FileSystemView fileSystemView = FileSystemView.getFileSystemView();
File currentDirectory = new File(".");

// Listing currentDirectory
File[] filesAndDirs = fileSystemView.getFiles(currentDirectory, false);
List<File> fileList = new ArrayList<File>();
List<File> dirList = new ArrayList<File>();
for (File file : filesAndDirs) {
    if (file.isDirectory())
        dirList.add(file);
    else
        fileList.add(file);
}
Collections.sort(dirList);
if (!fileSystemView.isFileSystemRoot(currentDirectory))
    dirList.add(0, new File(".."));
Collections.sort(fileList);

// Change to the parent directory
currentDirectory = fileSystemView.getParentDirectory(currentDirectory);

Which file contains function of core/session in magento?

I need to customize someone else's code,
and I found they used
Mage::getSingleton('core/session')->getMyCustomBlockInfo();
in the Order.php file for a custom order email,
but I can't find this function getMyCustomBlockInfo().
Can anyone tell me where this function resides?
Thanks
Those are magic get() and set() functions, and you are reading a session variable that is set with
Mage::getSingleton('core/session')->setMyCustomBlockInfo();
somewhere in your code. If you use a terminal, you can easily find it with the following grep:
grep '>setMyCustomBlockInfo(' . -rsni
It will list the files where your variable is set on the session.
Example:
Mage::getModel('catalog/product'); // or
Mage::getSingleton('catalog/product');
The code must be in the '../app/code/core/Mage/Catalog/Model/Product.php' file.
Then for
Mage::getSingleton('core/session');
the code must be in the '../app/code/core/Mage/Core/Model/Session.php' file.
Because the class Mage_Core_Model_Session ultimately extends Varien_Object, all the magic functions work, and you can call ->getData() to see the data inside:
Mage::getSingleton('core/session')->getData();
For your problem, when you call ->getData() you will see the key [my_custom_block_info].
You can set it with:
Mage::getSingleton('core/session')->setMyCustomBlockInfo('what');
Mage::getSingleton('core/session')->getMyCustomBlockInfo();
// will return 'what'

Zend_Pdf - load utf-8 pdf document

I'm currently trying to load a PDF document using the Zend_Pdf::load($filename) method and I'm getting
Error occured while 'xxx.pdf' file reading.
So I see that in Zend_Pdf_Parser::__construct there is this block:
while ($byteCount > 0 && !feof($pdfFile)) {
    $nextBlock = fread($pdfFile, $byteCount);
    if ($nextBlock === false) {
        require_once 'Zend/Pdf/Exception.php';
        throw new Zend_Pdf_Exception( "Error occured while '$source' file reading." );
    }

    $data      .= $nextBlock;
    $byteCount -= strlen($nextBlock);
}
if ($byteCount != 0) {
    require_once 'Zend/Pdf/Exception.php';
    throw new Zend_Pdf_Exception( "Error occured while '$source' file reading." );
}
After debugging, I can tell that strlen($nextBlock) is not returning the right value (based on $nextBlock = fread($pdfFile, $byteCount); )
If I use mb_strlen($nextBlock, '8bit') instead, this block passes. Now I'm getting another error:
Pdf file syntax error. 'startxref' keyword expected
So now I look into Zend_Pdf_StringParser::readLexeme() and I can see that, again, there is a problem with single-byte vs. multibyte string functions (strlen etc.).
So does anybody have a clue what's going on with Zend_Pdf? Is this a general bug, or am I just missing something?
I never used Zend_Pdf because it has very little potential. I advise you to integrate TCPDF into your project! ;)
I experienced the same error and it turned out to be a bug in Zend Guard. Apparently my version of the PHP encoder turns the ASCII NP form feed character (\f) inside string literals into the backslash (\) and the f characters (\\f).
The obfuscated version of
print bin2hex("\f");
outputs
5c66
instead of the expected
0c
This behavior causes Zend_Pdf_StringParser to parse 'startxre' instead of 'startxref' in readLexeme, causing the error you described.
If you are using a different version of the encoder or no encoder at all, then this may not be the cause of the problem (try reproducing it on a different PHP version).

ICU custom transliteration

I am looking to utilize the ICU library for transliteration, but I would like to provide a custom transliteration file for a set of specific custom transliterations, to be incorporated into the ICU core at compile time for use in binary form elsewhere. I am working with the source of ICU 4.2 for compatibility reasons.
As I understand it, from the ICU Data page of their website, one way of going about this is to create the file trnslocal.mk within ICUHOME/source/data/translit/ , and within this file have the single line TRANSLIT_SOURCE_LOCAL=custom.txt.
For the custom.txt file itself, I used the following format, based on the master file root.txt:
custom{
    RuleBasedTransliteratorIDs {
        Kanji-Romaji {
            file {
                resource:process(transliterator){"custom/Kanji_Romaji.txt"}
                direction{"FORWARD"}
            }
        }
    }
    TransliteratorNamePattern {
        // Format for the display name of a Transliterator.
        // This is the language-neutral form of this resource.
        "{0,choice,0#|1#{1}|2#{1}-{2}}" // Display name
    }
    // Transliterator display names
    // This is the English form of this resource.
    "%Translit%Hex"         { "%Translit%Hex" }
    "%Translit%UnicodeName" { "%Translit%UnicodeName" }
    "%Translit%UnicodeChar" { "%Translit%UnicodeChar" }
    TransliterateLATIN{
        "",
        ""
    }
}
I then store within the directory custom the file Kanji_Romaji.txt, as found here. Because it uses > instead of the → I have seen in other files, I converted each entry appropriately, so they now look like:
丁 → Tei ;
七 → Shichi ;
When I compile the ICU project, I am presented with no errors.
When I attempt to utilize this custom transliterator within a test file, however (a test file that works fine with the built-in transliterators), I am met with the error 65569:U_INVALID_ID.
I am using the following code to construct the transliterator and output the error:
UErrorCode status = U_ZERO_ERROR;
Transliterator *K_R = Transliterator::createInstance("Kanji-Romaji", UTRANS_FORWARD, status);
if (U_FAILURE(status))
{
    std::cout << "error: " << status << ":" << u_errorName(status) << std::endl;
    return 0;
}
Additionally, a loop through to Transliterator::countAvailableIDs() and Transliterator::getAvailableID(i) does not list my custom transliteration. I remember reading with regard to custom converters that they must be registered within /source/data/mappings/convrtrs.txt . Is there a similar file for transliterators?
It seems that my custom transliterator is either not being built into the appropriate packages (though there are no compile errors), is improperly formatted, or somehow not being registered for use. Incidentally, I am aware of the RuleBasedTransliterator route at runtime, but I would prefer to be able to compile the custom transliterations for use in any produced binary.
Let me know if any additional clarification is necessary. I know there is at least one ICU programmer on here, who has been quite helpful in other posts I have written and seen elsewhere as well. I would appreciate any help I can find. Thank you in advance!
Transliterators are sourced from CLDR - you could add your transliterator to CLDR (the crosswire directory contains it in XML format in the cldr/ directory) and rebuild ICU data. ICU doesn't have a simple mechanism for adding transliterators as you are trying to do. What I would do is forget about trnslocal.mk or custom.txt as you don't need to add any files, and simply modify root.txt - you might file a bug if you have a suggested improvement.