I am new to Lit specifically, and to web components in general. I have built a web component for use in Wix (as a custom element) using plain vanilla JavaScript, and have found some success. See https://app.njaplatform.one/rating-element.js for a sample of what I built.
However, I have been asked to redo the work in Lit (https://lit.dev/). I like the syntax of the code better, and it feels like a smarter way to go (as a developer). However, I am really struggling with publishing it.
Whilst I have been able to rebuild my component in Lit, and it works when I run a local Web Dev Server, I am really uncertain how to publish my Lit component so that it is available to other sites. I haven't found any great documentation, walkthroughs, or tutorials in this area. I am really struggling to get over this implementation hump (I am more of a front-end developer and can do some deployments, but I am often very dependent on documentation and well-established ways of doing things).
Key points:
I started with the typescript starter kit at https://lit.dev/docs/tools/starter-kits/
Running npm run serve serves the app with Web Dev Server, and that appears to work fine locally for testing.
I am not sure what I should be doing when I build/deploy the app.
I have made some attempts to publish it on some online services, and I am getting a mix of failures, or partial successes which produce some documentation for the component (generated by Eleventy), but the actual generated JavaScript appears to break with either:
1. file not found errors (404), or
2. Error resolving module specifier "lit". Relative module specifiers must start with "./", "../", or "/"
Not sure how to proceed, would appreciate some "Lit 101" input on strategies as to what I should be doing at this point.
The starter kits are set up primarily to be a development bed for standalone components, and not exactly for web apps.
The errors you are getting about resolving module specifiers are due to lit being published to npm with bare module specifiers, so that lit and its dependencies can be referenced by package name in the code and deduplicated by npm. Running with Web Dev Server in the starter kit does not have this problem because it uses a plugin, @rollup/plugin-node-resolve, which resolves those bare specifiers to paths in the node_modules directory.
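To make the error concrete, here is roughly the difference between what the component source says and what a browser can load on its own (the component and file names are made up, and the resolved path is only an illustration; the real path depends on how the specifier gets resolved):

```js
// my-element.js: a bare specifier, which browsers can't resolve by themselves.
import { LitElement, html } from 'lit';

// After a resolver/bundler has done its work, the import effectively points at
// a real file instead, something like (illustrative only):
// import { LitElement, html } from './node_modules/lit/index.js';

class MyElement extends LitElement {
  render() {
    return html`<p>Hello from Lit</p>`;
  }
}
customElements.define('my-element', MyElement);
```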
The optimal thing to do depends on what your use case is.
If you are creating a web app/site, the common thing to do for production apps would be to create a JS bundle of your source code. Have an entrypoint file that imports all of your components. You can use something like rollup, and apply the aforementioned @rollup/plugin-node-resolve, to create a single JS file. This should be added to your index.html, and your server or static host should serve it.
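A minimal sketch of such a rollup config (the file names here are placeholders, not something the starter kit generates for you):

```js
// rollup.config.js: bundles the app entrypoint into a single file.
import { nodeResolve } from '@rollup/plugin-node-resolve';

export default {
  // Entrypoint that imports all of your components.
  input: 'src/index.js',
  output: {
    file: 'dist/bundle.js',
    format: 'es',
  },
  // Resolves bare specifiers like 'lit' to real files under node_modules.
  plugins: [nodeResolve()],
};
```

You would then reference dist/bundle.js from index.html with a script tag of type "module" and have your server or static host serve the dist directory.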
If you're creating reusable components, you could just leave your code as is and it would be up to the end user to take the steps above to bundle. This would be ideal since if the end user uses multiple components authored with Lit, they won't get duplicate copies of Lit.
If you want your user to be able to simply add a script tag for a JS file and use your web components, you'd need to publish the bundled component, which you can also prepare with rollup. The downside of this is that each component will come with its own copy of Lit.
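If you go that route, the rollup config has the same shape, just pointed at a single component (again, the file names are placeholders):

```js
// rollup.config.js: produces one self-contained file for a single component.
import { nodeResolve } from '@rollup/plugin-node-resolve';

export default {
  input: 'src/rating-element.js',
  output: {
    file: 'dist/rating-element.bundled.js',
    format: 'es', // consumers can load it with a plain <script type="module"> tag
  },
  // Inlines lit into the output; this is what makes the bundle standalone,
  // and also why each component published this way ships its own copy of Lit.
  plugins: [nodeResolve()],
};
```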
Imagine you want to produce a specific static asset for your Next.js web app. An image collage for example, or perhaps a web manifest or site map.
My current strategy for this kind of scenario is to make a script that can produce the desired output directly in my /public folder, but then I have to push the built file along with my source files in my repo, which is less than ideal.
Or, I have to set up a separate parallel asset pipeline to re-create the asset when the source files change and which I would launch whenever I launch next dev. I would also need to run the script when next build is called.
In either case, I then also need to ignore the built file so it's not pushed along with the other files in /public…
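To make the scenario concrete, the kind of script I mean is nothing fancier than this (the page list, domain, and file names are invented for illustration):

```js
// scripts/build-sitemap.js: writes a generated asset straight into /public.
const fs = require('fs');
const path = require('path');

const pages = ['/', '/about', '/contact']; // in reality derived from source files
const xml =
  '<?xml version="1.0" encoding="UTF-8"?>\n' +
  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
  pages.map((p) => `  <url><loc>https://example.com${p}</loc></url>`).join('\n') +
  '\n</urlset>\n';

fs.writeFileSync(path.join(__dirname, '..', 'public', 'sitemap.xml'), xml);
```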
This kind of solution feels like I'm reinventing the wheel and losing the zero-config ideal around which the Next.js ecosystem revolves.
I'd much rather provide Next.js with my custom script and just hook into the existing asset pipeline, dev server lifecycle, and build script, letting Next.js do the heavy lifting for me, hence my question:
Does Next.js have a plugin architecture, extensions, or perhaps lifecycle hooks I could tap into to implement a custom asset pipeline?
I see some pluggable loaders exist for Next.js, such as the MDX loader, but it's not clear to me whether this is the way to go for what I'm trying to do, and I'm not finding any documentation about how to write my own loader, so I'm afraid this might not be a recommended approach…
I am using this code: https://github.com/pradeepramakrishna/Lightning-Experience/tree/master/aura-components to deploy to my org.
I have one component, consisting of eventLib.lib and interactive.js, in the eventLib folder. I put this folder in C:\MyDevOrg\force-app\main\default\aura on my local machine and tried to deploy it to the org using VS Code, but it didn't work.
I tried creating a folder in the Files tab and adding interactive.js to it, but that does not seem to work with the Aura components either.
How can I deploy this lib to the org, as it is used in some Aura components?
We don't know what you're trying to achieve. You asked a very technical question, to which the answer is "you're probably doing it wrong". Instead, try to describe the business functionality you are trying to achieve and you might get better answers.
You shouldn't have to import ui:eventLib. It's supposed to be part of Salesforce's core Aura components framework. But.
You've referenced a repo that hasn't been updated in a while; no promises it still compiles or is the best way to do X.
This repo seems to rely on the open source Aura framework, which is, well... dead in the water.
You might be able to reuse something from this repo in your app using SF's built-in tags - but the whole ui: library has been deprecated. Announced in Winter '20, almost 2 years ago, and finally dead in May 2021.
I don't think eventLib was ever exposed; it might be something needed only in the open source version, which complicates the matter even more. The answer would be to work not with these "decompiled" low-level tags but with their higher-level abstractions like ui:inputDate.
So, back to my question: what exactly are you trying to achieve?
Want to build something on the pure SF platform? (use pure Aura/LWC)
Want to have an app written in Angular, React, etc., in pure JS, connecting to SF data via API? (build it, upload it as a static resource, then import it using lightning:container)
Want to prettify an existing Java/PHP/.NET app, make it look more Lightning-ish, and embed it as an iframe? (look into https://www.lightningdesignsystem.com/ and connected apps + "canvas")
Want to expose a piece of SF as a reusable element that can be embedded in another website, or even be an Outlook plugin? (search for "lightning out")
Want to look at the modern equivalent of that old open source Aura repo and decide what to do next? Check out https://lwc.dev/.
I am new to ReactJS. Should I go with JSXTransformer or Babel for writing ReactJS code?
I understand that ReactJS does not depend on JSXTransformer / Babel, and we can use ReactJS without either of them, but I still want to know which one to use with ReactJS.
This is opinion-based, so it should probably be closed.
Pretty sure the React team have deprecated the use of the JSX Transformer (outside of in-browser development usage) in favour of Babel. Babel now supports everything that React needs (and more) in a convenient and standard way and should be considered the preferred method of JSX transformation.
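To illustrate what that transformation does (the component here is arbitrary; the second definition shows roughly what Babel's classic React transform emits):

```js
import React from 'react';

// JSX source (what you write):
const Greeting = ({ name }) => <h1>Hello, {name}</h1>;

// Roughly what Babel compiles it to (classic runtime):
const GreetingCompiled = ({ name }) =>
  React.createElement('h1', null, 'Hello, ', name);
```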
More information on React tooling can be found on their tooling page.
Matt Styles is right, it's being deprecated:
from here
JSXTransformer is another tool we built specifically for consuming JSX in the browser. It was always intended as a quick way to prototype code before setting up a build process. It would look for <script> tags with type="text/jsx" and then transform and run. This ran the same code that react-tools ran on the server. Babel ships with a nearly identical tool, which has already been integrated into JS Bin. We'll be deprecating JSXTransformer, however the current version will still be available from various CDNs and Bower.
However, it is great for learning the basics of React, focusing on component methods, passing props, etc., with an easy integration.
But I believe you won't be able to require any node modules, which will block you sooner or later.
To learn React and the Node environment, I suggest you work through a few tutorials, and test and read the code of simple boilerplate projects like these:
react-hot-boilerplate
react-transform-boilerplate
As of Play 2.0, it appears that there is no longer a way of creating a module for an existing Play application, other than by creating a new Play application. Having searched around a bit, I came across these instructions, which indicate that I must (or at least should) delete any routes created in the new module (application), and that the module's application.conf file is really just a stub that is required in order for the module to be recognized as a Play application.
I am new to Play, but apparently there used to be a console command ('new-module') for generating a module, which presumably created only those files which were needed for the module to be discovered by the application. It seems to me like it would still be useful to be able to quickly create a new module in this way, especially if registering a new module from the console also added the module to your build, and to the repository of your choice, thus removing the requirement for (as much) manual wiring.
I would also like to be able to maintain Play modules upon which my application depends as part of the same codebase/build, such that, when I make changes to a module, they are picked up at application compile time (for example, when play is ~ running and a changed file is saved). Does this already happen with modules registered as dependencies, or must I rebuild modules independently of my application?
Because I am a newcomer, I'm not positive that there isn't a way to accomplish these tasks in an automated manner. There is a chapter on packages and modules listed in the Play for Scala book (Part III, chapter 9), but the book is not yet complete and that chapter is, unfortunately, as yet unwritten.
If an experienced Play! developer would be so kind as to either confirm that the instructions to which I linked above are still the recommended procedure for creating a module and registering/maintaining it as a dependency, or else list a better procedure, I would greatly appreciate it.
Most of the information is valid.
In Play 2.x there is no difference between a regular library and a Play module (a library which itself depends on the play library jar).
The part about the routes file is still valid, but they introduced 'Sub-Router composition' to give you some extra freedom (search for 'Allow more modularization for your projects' on the highlight page).
Libraries (and thus Play modules) are referenced in the Build.scala file with version, for example:
"play.modules.mailer" %% "play-mailer" % "1.1.0"
If you are developing a module yourself you could use the 'publish-local' command to make sure other projects on your computer can find the dependency. Because modules are essentially versioned libraries you need to compile them separately from your application. However no-one is preventing you from running scripts to automate things.
I'm developing a web app.
If I include a jQuery plugin (or the jQuery file itself), this has to be put under my static directory, which is under SCM, to be served correctly.
Should I gitignore it, or add it, even if I don't plan on modifying anything from it?
And what about binary files (graphic resources) that might come with it?
Thanks in advance for any advice!
My view is that everything you need for your application to run correctly needs to be managed. This includes third-party code.
If you don't put it under SCM, how is it going to get deployed correctly on your production systems? If you have other ways of ensuring that, that's fine, but otherwise you run the risk that successful deployment is a matter of people remembering to do all the right things, rather than some automated low-risk "push the button" procedure.
If you don't manage it under SCM or something similar, how do you ensure that the versions you develop against and test against are the same? And that they're the same as production? Debugging an issue caused by a version difference you don't notice can be horrible.
I generally add external resources to my project directly. Doing so facilitates deployment and ensures that if someone changes the version of this file in your project, you have a clear audit history of what happened in case it causes issues in the code that you've written. Developers should know not to modify these external resources.
You could use something like git submodules, I suppose, but I haven't felt that this is worth the hassle in the past.
Binary files from external sources can be checked in to the project as well, although if they're extremely large you may want to consider a different approach.
There aren't a lot of reasons not to put external resources like jQuery into your repo:
If you pull it down from jQuery every time you check out or deploy, you have less control over which version you're using. This holds true for most third-party libraries; you probably don't want to upgrade your libraries without testing with your code to see if it breaks something.
You'll always have a complete copy of your site when you check out your repository and you won't need to go seeking resources that may have become unavailable.
For small (in terms of filesize) things like jQuery and images, I'd just add them unless you're really, really concerned about space.
It depends.
These arguments relate to having a copy of the library on your system rather than pulling it from its original location.
Arguments in favour:
It will ensure that everything needed for your project can be found in one place when someone else joins your development team. I've lost count of the number of times I've had to scramble around looking for the right versions of libraries in order to be able to get something working.
If you make any modifications to the library you can make these changes to the source controlled version so when a new version comes out you use the source control's merging tools to ensure your edits don't go missing.
Arguments against:
It could mean everyone has a copy of the library locally - unless you map the 3rd party tools to a central server.
Deploying could be problematic - again, unless you map the 3rd party tools to a central server and don't include them in the deploy script.