Error saving Eclipse project to Google Drive - google-plugin-eclipse

I have a newly-created Apps Script project in Drive which I successfully imported into Eclipse via the Google Plugin.
I made some edits and hit Save, and I get this:
Error saving Eclipse project TestProject to Drive (file ID: 18Xhe...)
com.google.gdt.eclipse.drive.driveapi.DriveWritingException: Server error while storing project in Drive.
at com.google.gdt.eclipse.drive.driveapi.DriveServiceFacade.writeProject(DriveServiceFacade.java:222)
at com.google.gdt.eclipse.drive.DriveEclipseProjectMediator.performDriveUpdate(DriveEclipseProjectMediator.java:446)
at com.google.gdt.eclipse.drive.DriveEclipseProjectMediator$2.run(DriveEclipseProjectMediator.java:406)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 500 Internal Server Error
{
  "code" : 500,
  "errors" : [ {
    "domain" : "global",
    "message" : "Internal Error",
    "reason" : "internalError"
  } ],
  "message" : "Internal Error"
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:145)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:423)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:343)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:460)
at com.google.gdt.eclipse.drive.driveapi.DriveServiceFacade.writeProject(DriveServiceFacade.java:219)
... 5 more
Any ideas?
Thanks in advance,
Jeffery

I see the same error message as well. The problem started occurring a few hours ago.
I've reported this at https://code.google.com/p/google-plugin-for-eclipse/issues/detail?id=284&thanks=284&ts=1401349699

Related

Setting up Apereo CAS Management integrated with CAS server

I want to install Apereo CAS Management (version 6.0) and integrate it with CAS Server (version 6.0).
I installed everything following these steps:
Step 1: I installed the CAS server
I checked it with the REST API and it worked (see the curl sketch after the configuration below).
My server runs at http://203.162.141.7:8080
This is the configuration of my CAS server. I put this config in /etc/cas/config. Here is my cas.properties file:
cas.server.name=http://203.162.141.7:8080
cas.server.prefix=${cas.server.name}/cas
logging.config: file:/etc/cas/config/log4j2.xml
server.port=8080
server.ssl.enabled=false
cas.serviceRegistry.initFromJson=false
cas.serviceRegistry.json.location=file:/etc/cas/services-repo
cas.authn.oauth.grants.resourceOwner.requireServiceHeader=true
cas.authn.oauth.userProfileViewType=NESTED
cas.authn.policy.requiredHandlerAuthenticationPolicyEnabled=false
cas.authn.attributeRepository.stub.attributes.email=casuser@example.org
#REST API JSON
cas.rest.attributeName=email
cas.rest.attributeValue=.+example.*
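(For reference, the REST check mentioned in Step 1 was along these lines; this is only a sketch and assumes the overlay's default casuser/Mellon test credential rather than anything specific to this setup:)
curl -i -X POST "http://203.162.141.7:8080/cas/v1/tickets" -d "username=casuser&password=Mellon"
A 201 Created response whose Location header contains a TGT-... ticket indicates the server and its REST module are working.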
Step 2: I installed the cas-management-overlay
I put my cas-management-overlay config file in /etc/cas/config too. Here is my management.properties file:
cas.server.name=http://203.162.141.7:8080
cas.server.prefix=${cas.server.name}/cas
mgmt.serverName=http://203.162.141.7:8088
mgmt.adminRoles[0]=ROLE_ADMIN
mgmt.userPropertiesFile=file:/etc/cas/config/users.json
server.port=8088
server.ssl.enabled=false
logging.config=file:/etc/cas/config/log4j2-management.xml
And here is my users.json file:
{
  "casuser" : {
    "@class" : "org.apereo.cas.mgmt.authz.json.UserAuthorizationDefinition",
    "roles" : [ "ROLE_ADMIN" ]
  }
}
Then I ran ./build.sh; the output it showed is in a screenshot that is not reproduced here.
Finally, I accessed http://203.162.141.7:8088/cas-management to open cas-management, but it redirects to http://203.162.141.7:8080/cas/login?service=http%3A%2F%2F203.162.141.7%3A8088%2Fcas-management%2F and shows an error (also only in a screenshot, not reproduced here).
I don't know where I have gone wrong.
I think since you haven't told the management webapp about the location of the service registry, it can't add itself as a registered service.
Manually add a registered service for http://203.162.141.7:8088/cas-management and you should be able to log in to the management app at that point.
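On that note, the management.properties shown above never tells the management webapp where the JSON service registry lives. Assuming the 6.0 management overlay reads the same service-registry properties as the server (a hedged sketch; verify the property name against your version's documentation), the missing piece would look something like this:
# management.properties - point the management webapp at the same JSON service registry
cas.serviceRegistry.json.location=file:/etc/cas/services-repo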
Here is my answer: the cas-management service registration file, /etc/cas/services-repo/casManagement-1.json
{
  "@class" : "org.apereo.cas.services.RegexRegisteredService",
  "serviceId" : "^https://domain:8088/cas-management.+",
  "name" : "casManagement",
  "id" : 1,
  "evaluationOrder" : 1,
  "allowedAttributes" : [ "cn", "mail" ]
}
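One follow-up note: the serviceId is a regular expression matched against the URL the browser actually sends, so for the setup described above it presumably has to match the http scheme and the IP address rather than https://domain. A hedged sketch based on the URLs earlier in this question:
"serviceId" : "^http://203.162.141.7:8088/cas-management.*"
(The unescaped dots are deliberately permissive; escape them if you want an exact match.)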

Can anyone help me with this error code in Data Fusion

I'm having a go at creating my first Data Fusion pipeline.
The data is going from a Google Cloud Storage CSV file to BigQuery.
I created the pipeline and carried out a preview run, which was successful, but after deployment trying to run it resulted in an error.
I pretty much accepted all the default settings, apart from obviously configuring my source and destination.
Error from the log:
com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Required 'compute.firewalls.list' permission for 'projects/xxxxxxxxxxx'",
    "reason" : "forbidden"
  } ],
  "message" : "Required 'compute.firewalls.list' permission for 'projects/xxxxxxxxxx'"
}
After deployment, the run fails.
Do note that, as part of creating an instance, you must set up permissions [0]. The "Cloud Data Fusion API Service Agent" role must be granted to the exact service account specified in that document, whose email address begins with "cloud-datafusion-management-sa@...".
Doing so should resolve your issue.
[0] : https://cloud.google.com/data-fusion/docs/how-to/create-instance#setting_up_permissions
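If you would rather do that grant from the command line, it might look roughly like the following. This is a sketch: roles/datafusion.serviceAgent is assumed to be the role ID behind "Cloud Data Fusion API Service Agent", and the --member value must be the exact service account email shown for your instance (the one beginning with cloud-datafusion-management-sa@), not the placeholder used here.
# PROJECT_ID and the service account domain below are placeholders
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:cloud-datafusion-management-sa@YOUR-TENANT-PROJECT.iam.gserviceaccount.com" \
  --role="roles/datafusion.serviceAgent"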

Unable to add user attributes to LDAP server using Apache Directory Studio's LDAP Browser in Eclipse

I have configured an LDAP server from Apache Directory Studio in my Eclipse. I have created a user and set the password as well.
Now I have to add the user attributes and their values from the Eclipse LDAP Browser, like userName, userFaxNumber, userPhoneNo, userDesignation....
The step I followed is shown in an attached screenshot (not reproduced here).
While trying to add them I am getting the error below:
Error while executing LDIF - [LDAP: error code 16 - NO_SUCH_ATTRIBUTE: failed for MessageType : MODIFY_REQUEST
java.lang.Exception: [LDAP: error code 16 - NO_SUCH_ATTRIBUTE: failed for MessageType : MODIFY_REQUEST Message ID : 20 Modify Request Object : 'uid=user1,ou=system' Modification[0] Operation : add Modification userFaxNumber: 222222 org.apache.directory.api.ldap.model.message.ModifyRequestImpl@a224a2db: ERR_04269 ATTRIBUTE_TYPE for OID userfaxnumber does not exist!]
at org.apache.directory.studio.connection.core.io.api.DirectoryApiConnectionWrapper.checkResponse(DirectoryApiConnectionWrapper.java:1374)
at org.apache.directory.studio.connection.core.io.api.DirectoryApiConnectionWrapper.access$9(DirectoryApiConnectionWrapper.java:1342)
at org.apache.directory.studio.connection.core.io.api.DirectoryApiConnectionWrapper$4.run(DirectoryApiConnectionWrapper.java:736)
at org.apache.directory.studio.connection.core.io.api.DirectoryApiConnectionWrapper.runAndMonitor(DirectoryApiConnectionWrapper.java:1269)
at org.apache.directory.studio.connection.core.io.api.DirectoryApiConnectionWrapper.checkConnectionAndRunAndMonitor(DirectoryApiConnectionWrapper.java:1205)
at org.apache.directory.studio.connection.core.io.api.DirectoryApiConnectionWrapper.modifyEntry(DirectoryApiConnectionWrapper.java:758)
at org.apache.directory.studio.ldapbrowser.core.jobs.ImportLdifRunnable.importLdifRecord(ImportLdifRunnable.java:515)
at org.apache.directory.studio.ldapbrowser.core.jobs.ImportLdifRunnable.importLdif(ImportLdifRunnable.java:272)
at org.apache.directory.studio.ldapbrowser.core.jobs.ExecuteLdifRunnable.executeLdif(ExecuteLdifRunnable.java:157)
at org.apache.directory.studio.ldapbrowser.core.jobs.ExecuteLdifRunnable.run(ExecuteLdifRunnable.java:123)
at org.apache.directory.studio.ldapbrowser.core.jobs.UpdateEntryRunnable.run(UpdateEntryRunnable.java:59)
at org.apache.directory.studio.connection.ui.RunnableContextRunner$1.run(RunnableContextRunner.java:116)
at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:119)
[LDAP: error code 16 - NO_SUCH_ATTRIBUTE: failed for MessageType : MODIFY_REQUEST Message ID : 20 Modify Request Object : 'uid=gridcomm22,ou=system' Modification[0] Operation : add Modification rlSubmitterFaxNumber: 222222 org.apache.directory.api.ldap.model.message.ModifyRequestImpl@a224a2db: ERR_04269 ATTRIBUTE_TYPE for OID userfaxnumber does not exist!]
Please help.. Thanks in advance..

SAP HANA XS project shared but cannot be seen in browser

I have an XS project that I have already shared to a HANA package, but it fails when I open it in the browser. The error shows:
404 - Not found
We could not find the resource you're trying to access.
It might be misspelled or currently unavailable.
My .xsaccess:
{
  "exposed" : true,
  "authentication" : [ { "method" : "Basic" } ],
  "cache_control" : "no-cache, no-store",
  "cors" : {
    "enabled" : false
  }
}
.xsapp:
{}
xsprivileges:
{
  "privileges" : [
    { "name" : "ProfileOwner", "description" : "Profile Ownership" }
  ]
}
And one question: could the problem be caused by the user's role or privileges, i.e. authorization? How do I fix this issue? Thanks.
The .xsapp should be an empty file with no content in it. The exposed parameter in the .xsaccess should be enough to expose your project. Make sure that all files are activated in the HANA repository.
If the error were authorization-specific you would get a 503 error. If the 404 error is an XSEngine page, either your code isn't activated or the package path is incorrect.
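One quick way to rule out a wrong package path: the XSEngine URL simply mirrors the package hierarchy. As a sketch with a made-up package name, if the project is shared into the package com.example.myapp and it contains an index.html, the expected URL is
http://<hana-host>:80<instance-number>/com/example/myapp/index.html
(e.g. port 8000 for instance 00). If that exact path still returns the XSEngine 404 page after activation, re-check the package path segment of the URL first.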

Can log4js-node run in a karma-browserify setup?

I am trying to learn Node test-driven frameworks.
I figured I should include a logging system, but can't seem to get it to work.
yourself@BDD0:~/BDD/simplest$ npm run test-browser
> simplest@1.0.0 test-browser /home/yourself/BDD/simplest
> ./node_modules/karma/bin/karma start
INFO [framework.browserify]: registering rebuild (autoWatch=true)
INFO [karma]: Karma v0.12.28 server started at http://localhost:9876/
INFO [launcher]: Starting browser Chrome
INFO [Chrome 39.0.2171 (Linux)]: Connected on socket p0GQRPZBeIOsd2Uz1e0p with id 97681222
INFO [framework.browserify]: 103086 bytes written (2.52 seconds)
INFO [framework.browserify]: bundle built
Chrome 39.0.2171 (Linux) ERROR
Uncaught Error: Problem reading log4js config { appenders: [ { type: 'console' } ], replaceConsole: false }.
Error was "Cannot find module 'console'" (Error: Cannot find module 'console'
at s (/tmp/9e6dc093e0e34f105c98657867f51cb8bdd77edf.browserify:1:156)
: :
: :
I'm hoping someone recognizes the error and can tell me what it means.
The Chrome browser has a console. I have nodejs-console in my dependencies.
Am I trying something that log4js cannot do?
Thanx.
OK, I got it to work.
I found it necessary to clone the log4js project and edit log4js.js in order to add a single line:
require('./appenders/console');
I have posted a pull request for my solution: Force bundling of appenders/console
I have created a little demo that shows it in action: javascript-bdd-baby-steps
Possibly my pull request will be rejected with a description of the correct solution. If so I will update here.
I hope this helps someone somewhere someday.
Update 2015/04/20:
My PR was accepted.
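For anyone landing here later, the change itself is tiny: a static require added to log4js.js so that browserify's static analysis bundles the console appender, which log4js otherwise loads through a computed require() that browserify cannot follow (hence the "Cannot find module 'console'" error above). A sketch against the log4js 0.6.x layout, not the exact upstream patch:
// log4js.js (in the cloned log4js project)
// Statically require the console appender so browserify includes it in the bundle.
require('./appenders/console');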