Is there a CDN or static hosting to place image fi...

Created on: March 27, 2026

Answered using GPT-5.4 Thinking by Chat01

Question

Is there a CDN or static hosting to place image files in GitLab so I can load them in my web apps?

Thought

Final answer on GitLab Pages for static hosting

The best option is GitLab Pages, which supports static file hosting with HTML, CSS, JS, images, and custom domains/HTTPS. For GitLab.com, you can use the default gitlab.io domain or a custom one. While raw repository endpoints can serve public files, they are limited by rate limits and base64 encoding. Self-managed instances allow configuring external storage/CDN, but that's an admin-level setting. GitLab's built-in setup doesn’t exactly function like a CDN on GitLab.com.

Thought for 45s

Answer

Yes. The supported GitLab way to host images for use in web apps is GitLab Pages. GitLab Pages publishes static files from a repository, supports plain static assets, runs on GitLab infrastructure, and can use custom domains plus HTTPS. By default it serves content from a public/ folder, or another folder you set with pages.publish. (GitLab Docs)

So for something like app images, a common setup is:

  • create a small GitLab project
  • put files under public/images/...
  • deploy with Pages
  • load them in your app from your Pages URL, such as https://<namespace>.gitlab.io/<project>/images/logo.png or from your custom domain if you attach one. GitLab documents default Pages domains and URL patterns for project and group sites. (GitLab Docs)

What I would not use for production image hosting is the repository file API or raw repository URLs. GitLab’s repository file/blob endpoints are primarily repository access mechanisms, not asset hosting, and large files on those endpoints are rate-limited. (GitLab Docs)

If you are on self-managed GitLab, there is also an admin-level option to put repository static objects like raw blobs and archives behind external storage such as a CDN, but that is an instance configuration, not something you normally enable per project. (GitLab Docs)

So the practical answer is:

  • GitLab.com: use GitLab Pages for static image hosting
  • Self-managed: use GitLab Pages, or ask your admin about CDN-backed static objects if you specifically need that infrastructure feature (GitLab Docs)

A very small .gitlab-ci.yml for image hosting via Pages can look like this:

yaml
pages:
  script:
    - mkdir -p public
    - cp -r images public/images
  pages: true
  artifacts:
    paths:
      - public

Then reference:

html
<img src="https://<namespace>.gitlab.io/<project>/images/logo.png" alt="logo">

If you want, I can give you the cleanest setup for either GitLab.com or self-managed GitLab.

Question

If I use Heroku to deploy my API where I need to get images, how do I do it?

Thought

Explaining how to get images for your Heroku API

The user likely wants to deploy an API on Heroku and get images. Here are some possible solutions:

  1. Use public static images hosted on GitLab Pages or a CDN.
  2. Store user-generated images in S3 or Cloudinary, not on the filesystem.
  3. Fetch images by URL and stream them in your app.

Keep in mind that Heroku's filesystem is ephemeral and dynos restart at least once a day, so you'll need an external storage solution like the ones mentioned above.

Providing image storage options for Heroku API

For public images, GitLab Pages works well—store URLs in a database, and the frontend loads them directly. For private or user-uploaded images, S3 or Cloudinary are good options, with Heroku storing only the URL or key. If you need to transform or authorize images, use an API to proxy the image stream.

Thought for 30s

Answer

Yes — if your API runs on Heroku, the main rule is:

Do not store runtime images on the Heroku dyno filesystem. Heroku dynos have an ephemeral filesystem: files written at runtime are not shared across dynos and are discarded on restart/redeploy. Heroku explicitly recommends using external storage like S3 for static assets and uploads. (devcenter.heroku.com)

The cleanest setups are:

1) Public images: host them separately, return URLs from your API

For product images, logos, article thumbnails, and other public assets, keep the images on a static host such as GitLab Pages and have your Heroku API return the image URL. GitLab Pages is meant for static files, supports custom domains and TLS, and is a better fit than serving repo files directly. (GitLab Docs)

Example flow:

  • images live at https://assets.example.com/products/123.jpg
  • your Heroku API stores either the full URL or just the path/key
  • the API returns that URL in JSON
  • your frontend loads the image directly from the static host

Example API response:

json
{
  "id": 123,
  "name": "Red Shirt",
  "imageUrl": "https://your-group.gitlab.io/your-project/images/red-shirt.jpg"
}

That is usually the best option because your app server is not wasting bandwidth proxying image bytes.
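The "store the path or the full URL" step above can be sketched as a small helper. This is a sketch under assumptions: `IMAGE_BASE_URL` is a config var pointing at your static host, and the fallback host `assets.example.com` is the placeholder from the flow above.

```js
// Turn a stored image path (or an already-absolute URL) into the URL the
// API returns. IMAGE_BASE_URL is assumed to point at your static host.
const IMAGE_BASE_URL = process.env.IMAGE_BASE_URL || "https://assets.example.com";

function resolveImageUrl(stored) {
  // Some rows may already hold a full URL; pass those through unchanged.
  if (/^https?:\/\//.test(stored)) return stored;
  // Otherwise join the base URL and the relative path, avoiding double slashes.
  return `${IMAGE_BASE_URL.replace(/\/$/, "")}/${stored.replace(/^\//, "")}`;
}
```

Storing only the path and joining it at response time makes it painless to move the images to a different host later: you change one config var instead of rewriting database rows.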

2) User uploads or private images: use object storage

If users upload images, or the files need controlled/private access, use S3 or a similar object store. Heroku’s own guidance for uploads and static assets points to S3, and Heroku also highlights direct-to-cloud upload patterns because dyno storage is temporary. (devcenter.heroku.com)

Typical pattern:

  • browser uploads file to S3
  • your API stores the S3 key or public URL in the database
  • later your API returns either:
    • a public URL, or
    • a signed URL for private access

3) Only proxy through Heroku if you need auth or image processing

If the image must be permission-checked or transformed before delivery, your API can fetch it from storage and stream it back. That works, but it is usually slower and costs more than letting the browser fetch directly from storage/CDN.

Example in Express:

js
app.get("/products/:id", async (req, res) => {
  const product = await db.products.findById(req.params.id);
  res.json({
    id: product.id,
    name: product.name,
    imageUrl: `${process.env.IMAGE_BASE_URL}/${product.image_path}`
  });
});

With Heroku config var:

bash
heroku config:set IMAGE_BASE_URL=https://your-group.gitlab.io/your-project/images

Then if image_path is red-shirt.jpg, your API returns:

json
{ "imageUrl": "https://your-group.gitlab.io/your-project/images/red-shirt.jpg" }

What to avoid

Avoid this pattern:

  • upload image to your Heroku app
  • save it under /tmp or another local folder
  • try to serve it later

That breaks because Heroku dynos restart and local writes disappear. Heroku documents that files written during runtime are temporary and can vanish after restarts, including regular platform restarts. (devcenter.heroku.com)

Best recommendation

For your case, I’d use:

  • GitLab Pages if the images are public and mostly fixed
  • S3/Cloudinary if the images are uploaded, private, or need resizing/transforms
  • Heroku API only to return URLs or signed URLs, not to permanently hold the files

If you want, I can sketch the exact setup for your stack, for example Node/Express, Rails, or Python/FastAPI.

Question

yes in node.js

Thought

Thought for 13s

Answer

Here’s the Node.js / Express setup I’d use on Heroku.

Because Heroku dynos have an ephemeral filesystem, any images you write locally during runtime can disappear on restart or redeploy, and they are not shared across dynos. Heroku recommends using external storage such as S3 for uploads and static assets instead. (Heroku)

Best pattern

Use your Heroku API for metadata and access control, not as the permanent home for image files:

  • Public, mostly fixed images: host them on GitLab Pages
  • User uploads or private images: store them in S3
  • Your Node API returns either a public URL or a signed URL, and the frontend loads the image directly from that URL. GitLab Pages is built for publishing static files from a repo, usually from public/, or another folder set with pages.publish. (GitLab Docs)

Option A: public images from GitLab Pages

Suppose your images are deployed to:

text
https://your-group.gitlab.io/your-assets/images/

Then in your Heroku app, store only the filename or path in the database.

Express example

js
const express = require("express");

const app = express();
const IMAGE_BASE_URL = process.env.IMAGE_BASE_URL;

app.get("/products/:id", async (req, res) => {
  // pretend this comes from your DB
  const product = {
    id: req.params.id,
    name: "Red Shirt",
    imagePath: "red-shirt.jpg"
  };

  res.json({
    id: product.id,
    name: product.name,
    imageUrl: `${IMAGE_BASE_URL}/${product.imagePath}`
  });
});

const port = process.env.PORT || 3000;
app.listen(port, () => {
  console.log(`Server running on ${port}`);
});

Set the config var on Heroku:

bash
heroku config:set IMAGE_BASE_URL=https://your-group.gitlab.io/your-assets/images

Your frontend gets JSON like:

json
{
  "id": "123",
  "name": "Red Shirt",
  "imageUrl": "https://your-group.gitlab.io/your-assets/images/red-shirt.jpg"
}

And then just use it:

html
<img src="https://your-group.gitlab.io/your-assets/images/red-shirt.jpg" alt="Red Shirt">

This is the simplest approach for public assets because the browser fetches the file directly from the static host instead of making your Heroku app proxy image bytes. GitLab Pages supports static-file publishing for exactly this use case. (GitLab Docs)

Option B: uploads or private images with S3

For uploaded images, use S3. Heroku’s Node guidance shows a direct-to-S3 upload pattern: your app generates a temporary signed request, and the browser uploads straight to S3 instead of sending the file through your dyno. That avoids tying up dynos and keeps files in durable storage. (Heroku)

Typical flow:

  1. frontend asks your API for an upload URL
  2. Node API creates a signed S3 URL
  3. frontend uploads directly to S3
  4. you save the S3 key in your DB
  5. later your API returns the public or signed download URL

Minimal Express shape

js
const express = require("express");
const {
  S3Client,
  PutObjectCommand,
  GetObjectCommand
} = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

const app = express();
app.use(express.json());

const s3 = new S3Client({ region: process.env.AWS_REGION });
const bucket = process.env.S3_BUCKET;

// create signed upload URL
app.post("/uploads/sign", async (req, res) => {
  const { fileName, contentType } = req.body;
  const key = `images/${Date.now()}-${fileName}`;

  const command = new PutObjectCommand({
    Bucket: bucket,
    Key: key,
    ContentType: contentType
  });

  const uploadUrl = await getSignedUrl(s3, command, { expiresIn: 60 });
  res.json({ key, uploadUrl });
});

// return signed download URL later
app.get("/images/:key", async (req, res) => {
  const key = `images/${req.params.key}`;
  const command = new GetObjectCommand({ Bucket: bucket, Key: key });
  const downloadUrl = await getSignedUrl(s3, command, { expiresIn: 300 });
  res.json({ downloadUrl });
});

const port = process.env.PORT || 3000;
app.listen(port, () => {
  console.log(`Listening on ${port}`);
});

Install packages:

bash
npm install express @aws-sdk/client-s3 @aws-sdk/s3-request-presigner

Set config vars:

bash
heroku config:set AWS_REGION=us-east-1
heroku config:set S3_BUCKET=your-bucket-name
heroku config:set AWS_ACCESS_KEY_ID=...
heroku config:set AWS_SECRET_ACCESS_KEY=...

This matches Heroku’s recommendation to use S3 for file storage and, in Node, to upload directly to S3 rather than through the dyno. (Heroku)
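On the browser side, the direct-to-S3 step can be sketched as a small helper. This is a hypothetical client for the `/uploads/sign` route in the Express example above; the endpoint name and response shape (`key`, `uploadUrl`) are assumptions from that sketch.

```js
// Hypothetical browser-side upload helper: ask the API to sign an upload,
// then PUT the file straight to S3, bypassing the Heroku dyno entirely.
async function uploadImage(file) {
  // 1. ask the API for a signed upload URL for this file
  const signRes = await fetch("/uploads/sign", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ fileName: file.name, contentType: file.type })
  });
  const { key, uploadUrl } = await signRes.json();

  // 2. upload the bytes directly to S3 with the signed URL
  const putRes = await fetch(uploadUrl, {
    method: "PUT",
    headers: { "Content-Type": file.type },
    body: file
  });
  if (!putRes.ok) throw new Error(`upload failed: ${putRes.status}`);

  // 3. return the key so the caller can record it via your API
  return key;
}
```

From a form, you would call `uploadImage(input.files[0])` and then send the returned key to whichever of your endpoints saves it in the database.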

What not to do on Heroku

Do not do this for persistent images:

js
app.post("/upload", (req, res) => {
  // save file to ./uploads
});

Even if that appears to work at first, files written to the dyno’s local disk are temporary and can disappear on restart; they also are not shared across dynos. (Heroku)

My recommendation

For your stack, I’d choose:

  • GitLab Pages for fixed public images like logos, banners, product seed images
  • S3 for anything users upload or anything private
  • Heroku Node API to return the correct URL, not to permanently serve stored files itself (GitLab Docs)

If you want, I can write you a complete Express app with one of these two setups: GitLab Pages image URLs or S3 signed uploads/downloads.
