Test suite failed to run using @mui/x-data-grid - material-ui

I am using @mui/x-data-grid in my React component.
I have written a test file that uses this component.
The test works locally, but it fails when run in a higher environment.
Below is the error:
Test suite failed to run
SyntaxError: Invalid regular expression: /^(\p{L}|\p{M}\p{L}|\p{M}|\p{N}|\p{Z}|\p{S}|\p{P})$/: Invalid escape
at Object.<anonymous> (node_modules/@mui/x-data-grid/index-cjs.js:15:41734)
When I check the node_modules/@mui/x-data-grid/index-cjs.js file, it has a regular expression in it.
How can I fix this issue?
I am using -
"#mui/material": "^5.2.4",
"#mui/x-data-grid": "^5.2.0",

Related

How to fail a test case in testRigor if any JS error is found?

I am running a test case in my testRigor suite. It is passing, but I can see some errors in the test steps that were executed.
Error: JS SEVERE: 52:61 Uncaught TypeError: Cannot read properties of undefined (reading 'toString')
The issue is that testRigor is passing all these test cases. I want it to fail the test case if any such errors are found. Is there any way to do this?
The fail with "error" command will fail a script. To do it within javascript,
try:
testRigor.execute('fail with "error"');
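If you can detect the problem yourself inside an "execute JavaScript" step, here is a hypothetical sketch building on the call above (window.__jsErrors is a made-up hook you would have to install yourself, for example via window.onerror; it is not something testRigor provides):
// Hypothetical sketch: fail the test from an "execute JavaScript" step
// when the page has recorded an uncaught error. window.__jsErrors is an
// assumed array populated by your own window.onerror handler;
// testRigor.execute is the call shown above.
if (window.__jsErrors && window.__jsErrors.length > 0) {
  testRigor.execute('fail with "JS error: ' + window.__jsErrors[0] + '"');
}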
For JavaScript errors detected on a page, test cases will note those errors by changing the color of the screenshot; however, they will not fail the test, because they are treated as "minor" errors. There is currently no built-in way to fail the test on these errors, so you could put in an enhancement request.

Error running mftf in Magento 2.3.5: I get an error on build:project

First time ever running mftf: I installed it with Composer in a Magento 2.3.5 installation, and when I run the build I get this message:
mftf build:project
mftf files removed from filesystem.
codeception.yml applied to /var/www/html/dev/tests/acceptance/codeception.yml
functional.suite.yml configuration successfully applied.
functional.suite.yml applied to /var/www/html/dev/tests/acceptance/tests/functional.suite.yml
command.php copied to /var/www/html/dev/tests/acceptance/utils/command.php
.credentials.example successfully applied.
.env configuration successfully applied.
In Configuration.php line 212:
Output path is not defined by key "paths: output"
In BuildProjectCommand.php line 105:
The codecept build command failed unexpectedly. Please see the above output for more details.
Any ideas what it can be?
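The message refers to the paths section of the generated codeception.yml in dev/tests/acceptance. As a sketch only (the directory values below are common Codeception defaults, not necessarily what your MFTF version generates), the block the error expects looks like this:
# Sketch of a Codeception "paths" block defining the "output" key the
# error complains about. Directory values are common defaults; compare
# them with the codeception.dist.yml shipped with your MFTF version.
paths:
    tests: tests
    output: tests/_output
    data: tests/_data
    support: tests/_support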

Parse error generating server stub with openapitools/openapi-generator-cli using OAS 3.0

I am trying to generate server code using openapitools/openapi-generator-cli which I installed globally using NPM.
When I run the command:
openapi-generator generate -i MyApi.yaml -g aspnetcore -o ./src
I get the following error:
[main] ERROR i.s.parser.SwaggerCompatConverter - failed to read resource listing
com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'openapi': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
I have also tried converting my spec file to json and encountered the same error.
How can I resolve this error with parsing the yaml file?
I ran my spec file through the online editor at http://editor.swagger.io/ and found an error in my yaml (I forgot to add a parameter entry for a path with a parameter in the path). Once I fixed the error, the generator worked correctly.
So this was user error, though the error message could be better.
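For illustration, here is the kind of omission involved, with a made-up /items/{id} path (all names are hypothetical): a templated path must declare a matching path parameter, otherwise the spec is invalid and tooling can fail with unhelpful errors.
# Hypothetical OAS 3.0 fragment: a templated path such as /items/{id}
# must declare the matching "parameters" entry; forgetting it is the
# kind of mistake described above.
paths:
  /items/{id}:
    get:
      parameters:
        - name: id        # must match the {id} in the path template
          in: path
          required: true
          schema:
            type: string
      responses:
        '200':
          description: OK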

Commands invalid after 'import_board_preset' command

Currently I am trying to follow the MathWorks tutorial 1 to register a TE0720 with a TE0701-6 carrier board in MATLAB. I followed the instructions, designed the block design and exported it as advised. Using the MATLAB HDL Workflow Advisor I can follow until step 4.1 Create Project. Here, I get the following error message:
invalid command name "CONFIG.PCW_INCLUDE_ACP_TRANS_CHECK"
while executing
"CONFIG.PCW_INCLUDE_ACP_TRANS_CHECK {0} CONFIG.PCW_IOPLL_CTRL_FBDIV {30} CONFIG.PCW_IO_IO_PLL_FREQMHZ {1000.000} CONFIG.PCW_IRQ_F2P_INTR {1} CONFIG..."
(procedure "create_root_design" line 49)
invoked from within
"create_root_design """
(file "vivado_custom_block_design.tcl" line 986)
while executing
"source vivado_custom_block_design.tcl"
(file "vivado_create_prj.tcl" line 15)
This is regarding the exported block design in the corresponding *.tcl file.
After deleting the line mentioned in the error, the error persists, just for the following line. This holds true until I delete all lines following
CONFIG.PCW_IMPORT_BOARD_PRESET {preset}
It seems to me that once the preset for the board is imported, all following commands are seen as invalid. If I put this line at the end of the list instead, I get the error
ERROR [Common 17-69] Command failed: Missing name/value pair in -dict argument.
If I remove this line, I get the error
ERROR [BD 41-1811] The interconnect </axi_interconnect_0> is missing a valid master interface connection
ERROR [Common 17-39] 'validate_bd_design' failed due to earlier errors.
Is there a way to fix this or what is the problem here?
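For reference, the CONFIG.* pairs in a generated block-design script are normally not commands of their own but arguments to a single set_property call; here is a sketch of the usual shape (the cell name processing_system7_0 is a guess), which is why a lost wrapper or broken line continuation makes Tcl report "invalid command name CONFIG...":
# Sketch: the CONFIG.* name/value pairs belong to one set_property -dict
# call. If the wrapper or a trailing backslash is lost, Tcl tries to run
# the first CONFIG.* token as a command, producing exactly the
# "invalid command name" error above. The cell name below is a guess.
set_property -dict [list \
  CONFIG.PCW_IMPORT_BOARD_PRESET {preset} \
  CONFIG.PCW_INCLUDE_ACP_TRANS_CHECK {0} \
  CONFIG.PCW_IOPLL_CTRL_FBDIV {30} \
] [get_bd_cells processing_system7_0]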
EDIT: I am using Vivado 2017.4 from the Vivado HL WebPACK. Could it be that a feature MATLAB needs to rebuild the project is not available in this edition?
EDIT 2: I started the complete tutorial fresh from scratch again and now I only get the error
ERROR: [BD 41-1811] The interconnect </axi_interconnect_0> is missing a valid master Interface connection
when going through the HDL Workflow Advisor. As far as I understand the issue, Vivado searches for something to connect the axi_interconnect to. But isn't this the interface port (DUT) described later in the tutorial (end of step 2 in "Register the custom reference design in HDL Workflow Advisor"), where the compiled Simulink model should be connected?

FATAL org.apache.hadoop.conf.Configuration - error parsing conf file: org.xml.sax.SAXParseException

I'm trying to run Pig locally (installed using Homebrew) to test a script. However, I get the following error when I attempt to run a simple dump from the interactive prompt pig -x local:
2012-07-16 23:20:40,447 [Thread-7] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
[Fatal Error] :63:85: Character reference "&#2" is an invalid XML character.
2012-07-16 23:20:40,688 [Thread-7] FATAL org.apache.hadoop.conf.Configuration - error parsing conf file: org.xml.sax.SAXParseException: Character reference "&#2" is an invalid XML character.
The same load/dump works fine on Elastic MapReduce.
I can't find any XML config files, and I've tried with both version 0.9.2 and 0.10.0
What am I missing?
Edit: Just checked a direct download (vs. homebrew) and it doesn't seem to work either
You should check that your Hadoop configuration files have correct configuration data.
Have a look in your hadoop/conf directory, inside:
hdfs-site.xml
mapred-site.xml
core-site.xml
Finally worked out what the problem was. I ended up having to use dtruss -p on the pig/java process. This revealed a temporary directory and dynamically generated xml files. Once the temporary directory was discovered, it all fell quickly into place.
It was picking up the proxy excludes from my network connections, which had, as far as I can tell, &#2 (http://www.fileformat.info/info/unicode/char/02/index.htm) embedded in it. How this invalid value came to be in my network preferences in the first place, I haven't the faintest clue.
The value was then being pulled into dynamically generated files, for example /tmp/hadoop-vertis/mapred/staging/vertis-1005847898/.staging/job_local_0001/job.xml.
The offending lines:
<property><name>ftp.nonProxyHosts</name><value>localhost|*.localhost|127.0.0.1|h|*.h</value></property>
<property><name>socksNonProxyHosts</name><value>localhost|*.localhost|127.0.0.1|h|*.h</value></property>
<property><name>http.nonProxyHosts</name><value>localhost|*.localhost|127.0.0.1|h|*.h</value></property>
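If you want to confirm which generated file carries the invalid character, running it through xmllint (part of libxml2; the path is the one quoted above) reports the same kind of parse error:
xmllint --noout /tmp/hadoop-vertis/mapred/staging/vertis-1005847898/.staging/job_local_0001/job.xml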