Is there a way to import a Bamboo Spec file via the REST API?

I'm currently trying to automate the process of creating a new Bamboo linked repository and starting the repository scan. I've already looked through the REST API documentation and tried to create a new plan and enable a scan, but that didn't work.
I also tried the Java Maven package from Atlassian, but it requires user credentials for authentication, whereas I need to authenticate via a security token. That package references an API endpoint, which I tried to send a request to with the YAML code, but it always responds with status code 500 and a Java stack trace. That's probably due to a malformed request body, but I can't figure out how to include the YAML content the same way the Maven package does.
Is there a way to create a linked repository via the REST API?
Thanks in advance!

Is there a way to create a linked repository via the REST API?
No, and there won't ever be one, because Atlassian is deprecating Bamboo Server in favour of its cloud-based alternative (which uses a totally different API). See https://jira.atlassian.com/browse/BAM-18453
Java Maven Package from Atlassian
What package is that? Based on what I said earlier, the only way to programmatically create a linked repository is to mimic the browser's POST request to updateLinkedRepository.action. That means you'll need to log in first to get a JSESSIONID cookie (the XSRF token check can be disabled, see https://confluence.atlassian.com/bamkb/rest-api-calls-fail-due-to-missing-xsrf-token-899447048.html#RESTAPIcallsfailduetoMissingXSRFToken-Workaround). Ping me if you need help; I still have Ansible code for the login part.
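Roughly, the flow looks like this in Python (a sketch only; the login action path, form field names, and repository parameters are assumptions, so capture the exact request from the browser's network tab and substitute the real values):

import requests

BASE = "https://bamboo.example.com"  # assumed Bamboo base URL
session = requests.Session()

# Step 1: log in with a normal form POST so the JSESSIONID cookie ends up on the session.
# The action path and field names are assumptions based on typical Atlassian login forms.
login = session.post(
    f"{BASE}/userlogin!doDefault.action",
    data={"os_username": "my-user", "os_password": "my-password"},
)
login.raise_for_status()

# Step 2: replay the browser's POST to updateLinkedRepository.action. The exact path and
# form fields depend on the Bamboo version and repository type; copy them from the request
# the browser sends when you save a linked repository manually.
resp = session.post(
    f"{BASE}/updateLinkedRepository.action",
    headers={"X-Atlassian-Token": "no-check"},  # may help if the XSRF check rejects the request
    data={
        "repositoryName": "my-linked-repo",  # placeholder form fields
        # ... remaining fields captured from the browser ...
    },
)
print(resp.status_code)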

Related

Can I get the Swagger interface to appear on a deployed Azure web API?

When one creates an ASP.NET Core Web API in Visual Studio 2022 and tests it locally, one gets a convenient Swagger page, built on an OpenAPI definition, for testing all HTTP endpoints.
However, when the API is deployed and I try to access {path-to-api}/swagger, it returns a 404 Not Found error. The API itself works fine on localhost, when both the API and the database are sitting on my own machine; it also works, for that matter, if the database is in the Azure cloud and I put the Azure SQL Database connection string into appsettings.json.
So is there a way to achieve this, preferably without too much hassle? Or am I wrong in wanting this; do developers mostly test their APIs locally? I want the Swagger UI online only for testing.
The problem is getting the Swagger functionality into the cloud and using it there. Is it possible, and is it good practice?
If you look at the startup code, you will notice that Swagger is only loaded in the Development environment, via an if check. Commenting that check out, or expanding it based on the environment, will allow a published version to generate the page on the target host.
if (app.Environment.IsDevelopment())
{
    app.UseSwagger();
    app.UseSwaggerUI();
}
I generally do that for first publishes, or when publishing to Dev/Test environments, to see it running. Once it is no longer needed, I put the environment check back in.
It may also be viable to leave it turned on in a Dev or UAT server, because one is also publishing the OpenAPI definition to APIM (Azure API Management), which takes the API and generates its own development environment, separate from the initial publish.
Also, once published it is not the default page; one still has to navigate to it explicitly, e.g. .../swagger/index.html.
I'm aborting this mission to deploy the Swagger interface to Azure along with my API. It's bad security practice to make the HTTP request methods so visible to everyone. So the answer to my question "do developers mostly test their APIs locally?" is apparently yes.
I wondered if I should remove the question, but I would like it to stand in case anyone else is contemplating doing the same thing - exposing an API online with the Swagger UI.

AWS SSO integration with G suite

I want to make use of AWS SSO and integrate it with G Suite.
I followed the official blog post - https://aws.amazon.com/blogs/security/how-to-use-g-suite-as-external-identity-provider-aws-sso/
However, I'm unable to perform the user synchronization from G Suite into AWS SSO via the mentioned ssosync project - https://github.com/awslabs/ssosync. There's an open issue regarding the fact that ssosync is no longer available in the AWS Serverless Application Repository. I've tried to clone and build the project manually, but I get a 404 error and can't figure out why.
I'm also unable to find a way to create users/groups in AWS SSO programmatically (I didn't find anything useful in the AWS SSO API reference).
Has anyone encountered this problem as well?
I think that does not work anymore. What about using this one instead?
https://github.com/awslabs/ssosync was updated to v2.0.0 a few days ago (Dec 2022).
I installed it from AWS Serverless Application Repository and it seems to work.
It requires that you configure every possible variable before successful execution. For variables that you don't wish to use, put *.

REST endpoint registration and bootstrap (creating range indexes) using UDeploy

I have my code in a Git repository, and I am using UDeploy to deploy it into a MarkLogic environment. I am able to move all my modules successfully, but I am facing two problems:
1. Creating new indexes
2. REST endpoint creation
Please let me know if there is any way to implement these two.
For creating indexes, I have tried doing it with API functions (admin:database-range-element-index()) and have been successful with that part. But is there any way to do it from UDeploy or a DevOps pipeline?
For registering a REST endpoint, I couldn't find any way to do it at all.
Have you looked at MarkLogic's REST Management API - https://docs.marklogic.com/REST/management? In particular, see if https://docs.marklogic.com/REST/POST/manage/v2/databases will help you create indexes via the REST Management API.
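For example, a range element index can be added by updating the database properties through the Management API. A rough sketch in Python follows; the host, port, credentials, database name, element name, and REST instance details are placeholders, and the payload shape should be double-checked against the documentation above:

import requests
from requests.auth import HTTPDigestAuth  # the Manage app server (port 8002) typically uses digest auth

MANAGE = "http://marklogic-host:8002"    # assumed Management API host and port
AUTH = HTTPDigestAuth("admin", "admin")  # assumed credentials

# Add a range element index by updating the database properties. Note that the
# "range-element-index" property replaces the existing list, so GET the current
# properties first and merge rather than overwriting blindly.
index_payload = {
    "range-element-index": [
        {
            "scalar-type": "string",
            "namespace-uri": "",
            "localname": "myElement",  # placeholder element name
            "collation": "http://marklogic.com/collation/",
            "range-value-positions": False,
        }
    ]
}
resp = requests.put(
    f"{MANAGE}/manage/v2/databases/my-database/properties",  # assumed database name
    json=index_payload,
    auth=AUTH,
)
resp.raise_for_status()

# Creating a REST API instance (the "REST endpoint") can be scripted in a similar way:
resp = requests.post(
    f"{MANAGE}/v1/rest-apis",
    json={"rest-api": {"name": "my-rest-api", "port": 8011}},  # placeholder name and port
    auth=AUTH,
)
resp.raise_for_status()

Either call can then be wrapped in a UDeploy script step or any other DevOps tooling.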
The most common way to deploy MarkLogic code and configuration is ml-gradle, a plugin for the widely used Gradle build tool. ml-gradle uses MarkLogic's Management API, mentioned above, and is scriptable.

Unable to integrate CQ5.6.1 with Site Catalyst

I'm having difficulty integrating AEM 5.6.1 with SiteCatalyst. It lets me connect successfully in the configuration, but it does not work on the framework setup.
I've followed the standard procedure to connect AEM to SiteCatalyst, and it accepts my login in the configuration, but it fails on the framework setup with the browser message 'We were not able to login to SiteCatalyst. Please check your credentials and try again.' Behind the scenes, the server log shows:
12.12.2014 14:10:06.967 *WARN* [0:0:0:0:0:0:0:1 [1418393406764] POST /libs/cq/analytics/sitecatalyst/service.json HTTP/1.1] com.day.cq.analytics.sitecatalyst.impl.SitecatalystHttpClientImpl Data center 'https://api3.omniture.com/admin/1.3/rest/' responded with errors {"error":{"code":500,"message":"Internal Server Error"}}
12.12.2014 14:10:06.967 *ERROR* [0:0:0:0:0:0:0:1 [1418393406764] POST /libs/cq/analytics/sitecatalyst/service.json HTTP/1.1] com.day.cq.analytics.sitecatalyst.impl.servlets.SitecatalystServlet Call to SiteCatalyst method 'Company.GetReportSuites' failed com.day.cq.analytics.sitecatalyst.SitecatalystException: not authenticated
I've tried accessing via the API Explorer and it works.
I've tried the troubleshooting guide without success.
I can log in to Site Catalyst, I'm an admin, I am in the web services access group.
I've tried using a clean install of CQ5.6.1 with geometrixx - it doesn't work either.
I've tried this from a server and from a localhost/dev machine with the same results. No proxy. I've even tried using the shared secret as the password but then it doesn't connect at all, and fails on the configuration screen.
What might cause this to fail?
If it doesn't work with a fresh install and Geometrixx, then it's probably an Adobe bug. That's typically the first thing support will ask you about.
I would also verify using Geometrixx Outdoors, or a more recent demo site, on your fresh install, just to ensure it's not an outdated ClientLib issue.
I know this isn't a direct answer to your question, but honestly, I would approach the integration differently. I've worked with the AEM-SC framework and it's buggy at best. It's very finicky, it doesn't REALLY work the way the documentation claims, and it requires that you're very specific about what Clientlibs are on the page.
Moving forward, I think using Adobe Dynamic Tag Manager is the better approach, for many reasons. My understanding is that it's Adobe's recommendation as well. I'd consider moving to that. In AEM 5.6.1, you'll have to customize your integration with DTM, but it's not very hard.
Solution: add a property on the SiteCatalyst configuration node (e.g. /etc/cloudservices/sitecatalyst/my-sc-configuration):
server=https://api.omniture.com/admin/1.2/rest/
It also seems to work with newer API versions such as https://api3.omniture.com/admin/1.3/rest/.
It would appear that 5.6.1 ignores the OSGi configuration, at least for the configuration screens. With this extra property, the framework page loads without error and allows selection of the RSID.
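If you want to script that instead of editing the node by hand, one option is a Sling POST. This is a sketch only; the exact node that holds the framework configuration properties (it may be the jcr:content child) and the credentials are assumptions, so verify the path in CRXDE first:

import requests

AEM = "http://localhost:4502"  # assumed author instance
# Assumed node; depending on the setup the property may live on the jcr:content child.
CONFIG_NODE = "/etc/cloudservices/sitecatalyst/my-sc-configuration/jcr:content"

resp = requests.post(
    f"{AEM}{CONFIG_NODE}",
    auth=("admin", "admin"),  # assumed credentials
    data={"server": "https://api.omniture.com/admin/1.2/rest/"},
)
resp.raise_for_status()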

Automated deployment of web site

I'm planning to do an automated deployment of a website, but I'm kind of stuck at the moment. I have looked at MSDeploy, and it has all the functionality needed for deploying a website. I have created a web application package (.zip file) and tested it on my local machine, and it deploys the website, i.e. it will:
Create the web application under the Default Web Site
Publish the files to the c:\inetpub\wwwroot directory
Set ACLs on directories, etc.
But I want to achieve a few extra steps, for example:
1. Check whether the web application exists in the Default Web Site; if not, create the web application
2. Check whether the application pool exists; if not, create an app pool (with a given name) with specific credentials and assign the app pool to the web application
3. Before deploying, take a backup copy of the existing web application (if it exists)
4. Publish an offline page (app_offline.htm)
5. Publish the files to the application directory
6. Replace the appSettings section (in the web.config file) with the actual values
7. Encrypt the web.config connection string
8. If there is any error while installing the web application, roll back the web application to its previous version
The question is whether I can achieve all of these functions via MSDeploy, or whether I need to write a script; if the latter, please suggest what scripting language I should use.
Please let me know if you need more information.
Thanks in advance
I'm not an expert on this topic, but I have been doing a bit of research on automated deployment with MSDeploy lately, and I think I can offer the following (numbered to match the steps above):
1. This is the default behaviour if you use the iisApp provider.
2. I know you can do this with the appPoolConfig provider, but I'm unsure how you would run this and #1 together as part of the same package. Perhaps as part of a pre- or post-sync command?
3. This is standard in v3, as long as it's set up on the server. I haven't used it myself, but read this anyway.
4. Fiddly. Not supported in MSDeploy, but you can vote for it if you want. Also check out this SO answer (and PackageWeb, by the same answer's author, is worth a look too).
5. Not sure I follow. This is done as part of a successful deployment, surely?
6. Use web.config transforms and, optionally, the aforementioned PackageWeb for a neat way to do this. Also check out Web Publish Profiles.
7. Difficult. My understanding is that the encryption is based on the machine.config, so you'd either have to run a post-sync command that runs some sort of remote PowerShell script on the remote server to encrypt the web.config using aspnet_regiis (a rough sketch of that step follows this list), or you'd have to encrypt the config as part of your build process and then muck about with custom keys and the RSA provider (some info here).
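To make the last point concrete, the encryption step itself looks roughly like this. It's a sketch only; the framework path, application path, and section name are assumptions, and in practice it would run on the target server (e.g. invoked by a post-sync command or a remote PowerShell session):

import subprocess

# Assumed paths; aspnet_regiis must run on the machine where the site is deployed.
ASPNET_REGIIS = r"C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet_regiis.exe"
APP_PATH = r"C:\inetpub\wwwroot\MyWebApp"  # physical path of the deployed application

# -pef encrypts the named configuration section for the application at the given physical path.
subprocess.run(
    [ASPNET_REGIIS, "-pef", "connectionStrings", APP_PATH],
    check=True,
)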
I hope that helps. As I said, I'm no expert, so I'm happy to be corrected by those more knowledgeable. It's maybe also worth mentioning that MSDeploy is a lot more powerful if you use it via the command line rather than creating packages from VS, although there is a bit of a learning curve to go with it.