As I wrote in my last post, I had a little bit of a naming conflict with another project called Bundler. Well, I ran a contest and the results were neck and neck, but the name suggested by my good friend Harper won out. And that name is SquishIt.
Unfortunately, when you delete a repository on GitHub, it doesn’t set up any redirects or anything, and everyone who has pulled down the repository is broken until they repoint their local repository at the new location. I apologize for this inconvenience, but it is better to get it out of the way now than to wait until the project has even more followers.
You can go find SquishIt at its new home. Please note that the code itself has not yet been updated to reflect the new name, but that is going to happen soon.
Thanks for everyone’s support!
The link to the new home points to your "Help Me Rename Bundler" post.
Squishit is a fantastic name too, love it 🙂
Love the new name, even better than Combinatron (certainly less of a mouthful)
What’s a Squi? 😉 Reminds me of the unfortunately named UnleashIt.
Oh no, you named it SquiShit. 🙂
@Alex Thanks! The link has been updated.
@AC @Judah Yep, I was aware of that when I named it. I don’t think it is a big deal 🙂
I think you should rename it again and actually make it "SquishShit"
@Josh I actually prefer a little bit of subtlety, but thanks for the suggestion 🙂
Wow, this is a great project, thanks! I just had a couple of questions. Would there be any benefit to adding support for remote files with URL paths? I’m thinking of the jQuery files hosted on Google’s CDN. Also, I don’t know if it would be worth fixing this, but I noticed that the relative paths for CSS files hosted in sub-directories are broken when they are squished.
@Walt Originally I wanted to pull in references from URLs, and the code is even still in the project, but the more I think about it, the more I realize that it is probably a bad idea. If you are pulling JavaScript from an external source, what happens when the Google CDN is down? You could end up in a situation where your JavaScript needs to be recompressed, the remote file can’t be fetched, and you get an exception. It seems like it could cause issues for relatively little gain.
I’ll look into the bug with CSS files hosted in sub-directories, though; I wasn’t aware that this was happening.
Yes, that makes sense. Also, you would lose the potential benefit of having the CDN file already cached on the client.
I don’t think the path issue in CSS files is a bug; it’s just what happens when you move the output to a different directory. If you pull from /css/sub and output to /css, obviously the relative image paths just don’t work.
SquishIt is sweet, already one of my required tools.
@Walt I think I just realized what you were talking about when I went to fix your issue… are you saying that relative paths inside of CSS files (for example, image references) are broken when a CSS file is hosted in a sub-directory? I’m not sure I am going to try to fix this, since that would require parsing the CSS files. This is going to be a known limitation, since your CSS files are moved when they are compressed. One solution would be to compress each folder separately, or to just host everything in the same folder. Sorry!
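For anyone following along, here is a minimal sketch of the limitation being discussed, using the /css/sub to /css layout Walt mentioned; the file and image names are hypothetical:

```css
/* /css/sub/style.css (hypothetical source file) */
.header {
    /* Relative to /css/sub/, this resolves to /css/images/bg.png */
    background: url(../images/bg.png);
}

/* If the squished output is written to /css/combined.css instead,
   the same relative URL now resolves to /images/bg.png, so the
   image reference breaks unless the path is rewritten. */
```

Compressing each folder separately, as suggested above, keeps the output next to the source files, so the relative paths continue to resolve correctly.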
I’m flattered.