Imagine you want to produce a specific static asset for your Next.js web app: an image collage, for example, or perhaps a web manifest or sitemap.
My current strategy for this kind of scenario is to write a script that produces the desired output directly in my /public folder, but then I have to commit the built file to my repo along with my source files, which is less than ideal.
Alternatively, I have to set up a separate, parallel asset pipeline that re-creates the asset whenever its source files change, launch it whenever I launch next dev, and also run it whenever next build is called.
In either case, I then also need to ignore the built file so it's not pushed along with the other files in /public…
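To make this concrete, here is roughly the kind of generator I mean; a minimal sketch that writes a web manifest into /public (the script name, manifest contents, and output path are just placeholders):

```js
// scripts/generate-assets.js -- a hypothetical standalone generator
// (file name, manifest contents, and output path are placeholders).
const { writeFile } = require("node:fs/promises");
const path = require("node:path");

async function generateManifest() {
  const manifest = {
    name: "My App",
    short_name: "App",
    start_url: "/",
    display: "standalone",
  };
  // Write straight into /public so Next.js serves it as a static file.
  const outFile = path.join(process.cwd(), "public", "site.webmanifest");
  await writeFile(outFile, JSON.stringify(manifest, null, 2));
  console.log(`wrote ${outFile}`);
}

module.exports = { generateManifest };

// Allow running it as a one-off: node scripts/generate-assets.js
if (require.main === module) {
  generateManifest().catch((err) => {
    console.error(err);
    process.exit(1);
  });
}
```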
This kind of solution feels like I'm reinventing the wheel and losing the zero-config ideal around which the Next.js ecosystem revolves.
I'd much rather provide Next.js with my custom script and just hook into the existing asset pipeline, dev server lifecycle, and build script, letting Next.js do the heavy lifting for me, hence my question:
Does Next.js have a plugin architecture, extensions, or perhaps lifecycle hooks I could tap into to implement a custom asset pipeline?
I see that some pluggable loaders exist for Next.js, such as the MDX loader, but it's not clear to me whether that is the way to go for what I'm trying to do. I'm also not finding any documentation on how to write my own loader, so I'm afraid this might not be a recommended approach…
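The closest thing I've found so far is the custom webpack function in next.config.js, where I could push a small ad-hoc webpack plugin that re-runs my generator before each compilation. Something along these lines is what I'm imagining (generateManifest is the hypothetical helper from the sketch above), though I have no idea whether this is a sanctioned approach:

```js
// next.config.js -- a rough sketch using the `webpack` escape hatch, not an
// official plugin/lifecycle API; generateManifest is my own hypothetical helper.
const { generateManifest } = require("./scripts/generate-assets");

module.exports = {
  webpack(config, { isServer }) {
    if (!isServer) {
      config.plugins.push({
        // A minimal ad-hoc webpack plugin: regenerate the asset before each
        // compilation, so it is refreshed by `next dev` rebuilds and `next build`.
        apply(compiler) {
          compiler.hooks.beforeCompile.tapPromise("GenerateStaticAssets", async () => {
            await generateManifest();
          });
        },
      });
    }
    return config;
  },
};
```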
How can I remove the web part of a Flutter project? I have a project that was automatically created with web support. I am now 100% certain the app won't be used on the web, and I just want to remove support for it. Is there a terminal command to help me do that?
I tried checking through the other questions, but they were asking how to disable web support altogether. I just want to remove web support from one project.
In order to fix this, I followed this link. The reason I can't easily recreate the project is that it has grown so much that moving it into another project would be a big hassle. This happened after we updated all of our libraries to their latest versions.
FirestoreWeb caused our project to stop compiling, but since we aren't using Flutter web and don't plan to use it for this project, it wouldn't make sense for us to implement it.
This answer was posted as an edit to the question Remove Web Part from Existing Flutter Project by the OP kobowo under CC BY-SA 4.0.
We use Gradle as our build tool and the idea plugin to generate the IntelliJ project/module files. The process for a new developer on the project would look like this:
1. pull from source control
2. run 'gradle idea'
3. open IDEA and be able to develop without any further setup
This all works nicely, but it generally only gets exercised when a new developer joins or someone gets a new machine. I would really like to test this more frequently and automatically, in the same way we run our unit/integration tests as part of our continuous integration process.
Does anyone know if this is possible and if there are any libraries for doing this kind of thing?
You can also substitute Eclipse for IDEA, as we have a similar process for those who prefer using Eclipse.
The second step (with or without step one) is easy to smoke test (just execute the task as part of a CI build); the third is less so. However, if you are following best practices and regenerate the IDEA files rather than committing them to source control, developers will likely perform both steps more or less regularly (e.g. every time a dependency changes).
As Peter noted, the real challenge is step #3. The first two are solved by your SCM plugin and the Gradle task. You could try automating the last step by doing something like this:
identify the proper command-line option, on your platform, for opening a specified IntelliJ project
find a simple, good-enough scenario that validates that the generated project works as it should, e.g. a clean followed by a build. Make sure you can reproduce these steps using keyboard shortcuts only. Validation could be done by checking the produced artifacts, test result reports, etc.
use an external library, like Robot, to script the starting of IntelliJ and the replaying of your keyboard shortcuts. Here's a simple example with Robot. Use a dynamic language with a built-in console instead of pure Java for this; it will speed up your scripting a lot...
Another idea would be to include a daemon plugin in IntelliJ to receive commands from an external CLI. Otherwise, get in touch with the IntelliJ team; they may have something to ease your work here.
Notes:
beware of false negatives: any failure could be caused by external issues, like project instability. Try to make sure you only build from a validated working project...
beware of false positives: any assumption or unchecked result code could hide issues. Make sure you properly clean the workspace and installation, so you have a repeatable state and a standard scenario matching first use.
Final thoughts: while interesting from a theoretical angle, this automation exercise may not bring all the required results, i.e. the validation of the platform. Still, it's an interesting learning experience and could serve as material for a nice short talk, especially if you find out interesting stuff. Make it a beer challenge with your team when you have a few idle hours, to see who can implement a working solution the fastest ;) Good luck!
I'm developing a web app.
If I include a jQuery plugin (or the jQuery file itself), it has to go under my static directory, which is under SCM, to be served correctly.
Should I gitignore it, or add it, even though I don't plan on modifying anything in it?
And what about binary files (graphic resources) that might come with it?
Thanks in advance for any advice!
My view is that everything you need for your application to run correctly needs to be managed. This includes third-party code.
If you don't put it under SCM, how is it going to get deployed correctly on your production systems? If you have other ways of ensuring that, that's fine, but otherwise you run the risk that successful deployment is a matter of people remembering to do all the right things, rather than some automated low-risk "push the button" procedure.
If you don't manage it under SCM or something similar, how do you ensure that the versions you develop against and test against are the same? And that they're the same as production? Debugging an issue caused by a version difference you don't notice can be horrible.
I generally add external resources to my project directly. Doing so facilitates deployment and ensures that if someone changes the version of this file in your project, you have a clear audit history of what happened in case it causes issues in the code that you've written. Developers should know not to modify these external resources.
You could use something like git submodules, I suppose, but I haven't felt that this is worth the hassle in the past.
Binary files from external sources can be checked in to the project as well, although if they're extremely large you may want to consider a different approach.
There aren't a lot of reasons not to put external resources like jQuery into your repo, and a couple of good reasons to do it:
If you pull it down from jQuery every time you check out or deploy, you have less control over which version you're using. This holds true for most third-party libraries; you probably don't want to upgrade your libraries without testing them against your code to see if anything breaks. (A sketch of the alternative, fetching at deploy time with a pinned version and checksum, appears below.)
You'll always have a complete copy of your site when you check out your repository and you won't need to go seeking resources that may have become unavailable.
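If you do go the fetch-at-deploy-time route instead, you'd want to at least pin an exact version and verify a checksum so the file can't silently change underneath you. Here's a hypothetical sketch, written as a modern Node script purely for illustration, where the version, URL, output path, and expected hash are placeholders:

```js
// fetch-jquery.js -- hypothetical deploy-time fetch with a pinned version and
// checksum verification (the hash below is a placeholder, not a real checksum).
const crypto = require("node:crypto");
const { writeFileSync } = require("node:fs");

const VERSION = "3.7.1";
const EXPECTED_SHA256 = "<known-good-sha256-goes-here>";
const URL = `https://code.jquery.com/jquery-${VERSION}.min.js`;

async function main() {
  const res = await fetch(URL); // global fetch, Node 18+
  if (!res.ok) throw new Error(`download failed: HTTP ${res.status}`);
  const body = Buffer.from(await res.arrayBuffer());

  // Refuse to deploy anything that doesn't match the known-good checksum.
  const sha256 = crypto.createHash("sha256").update(body).digest("hex");
  if (sha256 !== EXPECTED_SHA256) {
    throw new Error("checksum mismatch -- refusing to deploy this file");
  }
  writeFileSync("static/jquery.min.js", body);
  console.log(`fetched jQuery ${VERSION} (${body.length} bytes)`);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

That's noticeably more moving parts than just committing the file.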
For small (in terms of filesize) things like jQuery and images, I'd just add them unless you're really, really concerned about space.
It depends.
These arguments relate to having a copy of the library on your system rather than pulling it from its original location.
Arguments in favour:
It will ensure that everything needed for your project can be found in one place when someone else joins your development team. I've lost count of the number of times I've had to scramble around looking for the right versions of libraries in order to be able to get something working.
If you make any modifications to the library, you can make those changes in the source-controlled version, so when a new version comes out you can use the source control's merging tools to ensure your edits don't go missing.
Arguments against:
It could mean everyone has a copy of the library locally - unless you map the 3rd-party tools to a central server.
Deploying could be problematic - again, unless you map the 3rd-party tools to a central server and don't include them in the deploy script.
Should libraries that the application relies on be stored in source control? One part of me says it should and another part says no. It feels wrong to add a 20 MB library that dwarfs the entire app just because you rely on a couple of functions from it (albeit rather heavily). Should you store just the jar/dll, or maybe even the distributed zip/tar of the project?
What do other people do?
Store everything you will need to build the project 10 years from now. I store the entire zip distribution of any library, just in case.
Edit for 2017:
This answer did not age well :-). If you are still using something old like Ant or make, the above still applies. If you use something more modern like Maven or Gradle (or NuGet on .NET, for example), with dependency management, you should be running a dependency management server in addition to your version control server. As long as you have good backups of both, and your dependency management server does not delete old dependencies, you should be OK. For an example of a dependency management server, see Sonatype Nexus or JFrog Artifactory, among many others.
As well as having third-party libraries in your repository, it's worth doing it in a way that makes it easy to track and merge in future updates to the library (security fixes, for example). If you are using Subversion, using a proper vendor branch is worthwhile.
If you know it'd be a cold day in hell before you'll be modifying your third party's code, then (as #Matt Sheppard said) an external makes sense and gives you the added benefit that it becomes very easy to switch up to the latest version of the library should security updates or a must-have new feature make that desirable.
Also, you can skip externals when updating your code base, saving on the long, slow load process, should you need to.
#Stu Thompson mentions storing documentation etc. in source control. In bigger projects I've stored our entire "clients" folder in source control, including invoices / bills / meeting minutes / technical specifications, etc. The whole shooting match. Although, ahem, do remember to store these in a SEPARATE repository from the one you'll be making available to: other developers; the client; your "browser source view"... cough... :)
Don't store the libraries; they're not, strictly speaking, part of your project, and they uselessly take up room in your revision control system. Do, however, use Maven (or Ivy for Ant builds) to keep track of which versions of external libraries your project uses. You should run a mirror of the repo within your organisation (and back it up) to ensure you always have the dependencies under your control. This ought to give you the best of both worlds: external jars outside your project, but still reliably available and centrally accessible.
We store the libraries in source control because we want to be able to build a project by simply checking out the source code and running the build script. If you aren't able to get latest and build in one step then you're only going to run into problems later on.
Never store your 3rd-party binaries in source control. Source control systems are platforms that support concurrent file sharing, parallel work, merging efforts, and change history. Source control is not an FTP site for binaries. 3rd-party assemblies are NOT source code; they change maybe twice per SDLC. The desire to be able to wipe your workspace clean, pull everything down from source control, and build does not mean 3rd-party assemblies need to be stuck in source control. You can use build scripts to control pulling 3rd-party assemblies from a distribution server. If you are worried about controlling which branch/version of your application uses a particular 3rd-party component, you can control that through build scripts as well. People have mentioned Maven for Java, and you can do something similar with MSBuild for .NET.
I generally store them in the repository, but I do sympathise with your desire to keep the size down.
If you don't store them in the repository, they absolutely do need to be archived and versioned somehow, and your build system needs to know how to get them. Lots of people in the Java world seem to use Maven for fetching dependencies automatically, but I've not used it, so I can't really recommend for or against it.
One good option might be to keep a separate repository of third-party systems. If you're on Subversion, you could then use Subversion's externals support to automatically check out the libraries from the other repository. Otherwise, I'd suggest keeping an internal anonymous FTP (or similar) server which your build system can automatically fetch requirements from. Obviously you'll want to make sure you keep all the old versions of libraries, and have everything there backed up along with your repository.
What I have is an intranet Maven-like repository where all 3rd-party libraries are stored (not only the libraries, but their respective source distributions with documentation, Javadoc and everything). The reasons are the following:
why store files that don't change in a system specifically designed to manage files that change?
it dramatically speeds up checkouts
each time I see "something.jar" stored under source control I ask "and which version is it?"
I put everything except the JDK and IDE in source control.
Tony's philosophy is sound. Don't forget database creation scripts and data structure update scripts. Before wikis came out, I used to even store our documentation in source control.
My preference is to store third party libraries in a dependency repository (Artifactory with Maven for example) rather than keeping them in Subversion.
Since third-party libraries aren't managed or versioned like source code, it doesn't make a lot of sense to intermingle them. Remote developers also appreciate not having to download large libraries over a slow VPN link when they can get them more easily from any number of public repositories.
At a previous employer we stored everything necessary to build the application(s) in source control. Spinning up a new build machine was a matter of syncing with the source control and installing the necessary software.
Store third party libraries in source control so they are available if you check your code out to a new development environment. Any "includes" or build commands that you may have in build scripts should also reference these "local" copies.
As well as ensuring that third party code or libraries that you depend on are always available to you, it should also mean that code is (almost) ready to build on a fresh PC or user account when new developers join the team.
Store the libraries! The repository should be a snapshot of what is required to build a project at any moment in time. As the project comes to require different versions of external libraries, you will want to update / check in the newer versions of those libraries. That way you will be able to get all the right versions to go with an old snapshot if you have to patch an older release, etc.
Personally, I have a dependencies folder as part of my projects and store referenced libraries in there.
I find this makes life easier, as I work on a number of different projects, often with interdependent parts that need the same version of a library, meaning it's not always feasible to update to the latest version of a given library.
Having all dependencies used at compile time for each project means that a few years down the line, when things have moved on, I can still build any part of a project without worrying about breaking other parts. Upgrading to a new version of a library is simply a case of replacing the file and rebuilding related components; not too difficult to manage if need be.
Having said that, I find most of the libraries I reference are relatively small, weighing in at around a few hundred KB, rarely bigger, which makes it less of an issue for me to just stick them in source control.
Use git submodules, and either reference the 3rd-party library's main git repository or (if it doesn't have one) create a new git repository for each required library. There's no reason why you're limited to just one git repository, and I don't recommend using somebody else's project as merely a directory in your own.
Store everything you'll need to build the project, so you can check it out and build without doing anything else.
(and, as someone who has experienced the pain - please keep a copy of everything needed to get the controls installed and working on a dev platform. I once got a project that could build - but without an installation file and reg keys, you couldn't make any alterations to the third-party control layout. That was a fun rewrite)
You have to store everything you need in order to build the project.
Furthermore, different versions of your code may have different 3rd-party dependencies.
You'll want to branch your code into a maintenance version together with its 3rd-party dependencies...
Personally, what I have done, and have so far liked the results of, is to store libraries in a separate repository and then link to each library that I need in my other repositories through the Subversion svn:externals feature. This works nicely because I can keep versioned copies of most of our libraries (mainly managed .NET assemblies) in source control without bulking up the size of our main source code repository at all. Having the assemblies stored in the repository in this fashion means the build server doesn't have to have them installed to make a build. I will say that getting a build to succeed in the absence of Visual Studio was quite a chore, but now that we have it working we are happy with it.
Note that we don't currently use many commercial third-party control suites or that sort of thing, so we haven't run into licensing issues where it may be required to actually install an SDK on the build server, but I can see how that could easily become a problem. Unfortunately I don't have a solution for that and will plan on addressing it when I first run into it.