Build a Video Platform: Video Transcoding Pipeline

Videos are on disk, but they're raw MP4 files. A 1080p, 30-minute lecture is roughly 2GB — you can't just serve that to a user on a mobile connection. In this post, we'll build a transcoding pipeline that converts every upload into adaptive HLS streams at multiple resolutions.
HLS (HTTP Live Streaming) splits a video into small segments (typically 10 seconds each) and creates playlists that let the player switch between quality levels on the fly. Buffering on slow connections? Switch to 360p. Fast WiFi? Jump to 720p. That's adaptive bitrate streaming.
Time commitment: 3–4 hours
Prerequisites: Phase 5: Video Upload & Storage
What we'll build in this post:
✅ FFmpeg HLS transcoding to 360p and 720p
✅ Async processing with Spring @Async and thread pool
✅ Progress tracking in Redis with percentage updates
✅ ffprobe metadata extraction (duration, resolution, codec)
✅ Automatic transcoding trigger after upload
✅ Admin UI for transcoding status and progress
✅ Master playlist generation for adaptive streaming
How HLS Works
Before diving into code, let's understand what we're building:
The output for each lesson looks like this:
hls/
└── {courseId}/
    └── {lessonId}/
        ├── master.m3u8        # Points to quality-specific playlists
        ├── 360p/
        │   ├── playlist.m3u8  # Lists all 360p segments
        │   ├── segment-000.ts # 10-second video chunk
        │   ├── segment-001.ts
        │   └── ...
        └── 720p/
            ├── playlist.m3u8
            ├── segment-000.ts
            ├── segment-001.ts
            └── ...

The master playlist (master.m3u8) tells the player what quality levels are available. The player picks one based on network conditions and starts downloading segments.
FFmpeg Setup
Add FFmpeg to Docker
# api/Dockerfile
FROM eclipse-temurin:21-jre-alpine
# Install FFmpeg
RUN apk add --no-cache ffmpeg
WORKDIR /app
COPY build/libs/*.jar app.jar
ENTRYPOINT ["java", "-jar", "app.jar"]

For local development, install FFmpeg:
# macOS
brew install ffmpeg
# Ubuntu/Debian
sudo apt install ffmpeg
# Verify installation
ffmpeg -version
ffprobe -version

Video Metadata Extraction with ffprobe
Before transcoding, we extract metadata from the uploaded video — duration, resolution, codec, and bitrate. This information helps us make transcoding decisions and display lesson duration to users.
// src/main/java/com/videoplatform/api/service/VideoMetadataService.java
package com.videoplatform.api.service;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.file.Path;
import java.util.stream.Collectors;
@Service
public class VideoMetadataService {
private static final Logger log = LoggerFactory.getLogger(VideoMetadataService.class);
private final ObjectMapper objectMapper = new ObjectMapper();
public VideoMetadata extractMetadata(Path videoPath) {
try {
ProcessBuilder pb = new ProcessBuilder(
"ffprobe",
"-v", "quiet",
"-print_format", "json",
"-show_format",
"-show_streams",
videoPath.toString()
);
pb.redirectErrorStream(true);
Process process = pb.start();
String output;
try (BufferedReader reader = new BufferedReader(
new InputStreamReader(process.getInputStream()))) {
output = reader.lines().collect(Collectors.joining("\n"));
}
int exitCode = process.waitFor();
if (exitCode != 0) {
throw new RuntimeException("ffprobe failed with exit code: " + exitCode);
}
return parseMetadata(output);
} catch (Exception e) {
log.error("Failed to extract metadata from {}", videoPath, e);
return VideoMetadata.unknown();
}
}
private VideoMetadata parseMetadata(String json) throws Exception {
JsonNode root = objectMapper.readTree(json);
// Get duration from format
double duration = root.path("format").path("duration").asDouble(0);
// Find the video stream
JsonNode streams = root.path("streams");
int width = 0, height = 0;
String codec = "unknown";
for (JsonNode stream : streams) {
if ("video".equals(stream.path("codec_type").asText())) {
width = stream.path("width").asInt(0);
height = stream.path("height").asInt(0);
codec = stream.path("codec_name").asText("unknown");
break;
}
}
return new VideoMetadata(
(int) Math.round(duration),
width,
height,
codec
);
}
public record VideoMetadata(
int durationSeconds,
int width,
int height,
String codec
) {
public static VideoMetadata unknown() {
return new VideoMetadata(0, 0, 0, "unknown");
}
public String resolution() {
return width + "x" + height;
}
}
}

Example ffprobe output (parsed internally):
{
"format": {
"duration": "1847.52",
"size": "1572864000",
"bit_rate": "6808000"
},
"streams": [
{
"codec_type": "video",
"codec_name": "h264",
"width": 1920,
"height": 1080
},
{
"codec_type": "audio",
"codec_name": "aac"
}
]
}

Transcoding Service
This is the core of the pipeline. It runs FFmpeg to convert a raw video into HLS segments at multiple resolutions.
// src/main/java/com/videoplatform/api/service/TranscodingService.java
package com.videoplatform.api.service;
import com.videoplatform.api.config.StorageConfig;
import com.videoplatform.api.model.enums.LessonStatus;
import com.videoplatform.api.service.VideoMetadataService.VideoMetadata;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
@Service
public class TranscodingService {
private static final Logger log = LoggerFactory.getLogger(TranscodingService.class);
private final StorageConfig storageConfig;
private final VideoMetadataService metadataService;
private final TranscodingProgressService progressService;
private final LessonService lessonService;
public TranscodingService(StorageConfig storageConfig,
VideoMetadataService metadataService,
TranscodingProgressService progressService,
LessonService lessonService) {
this.storageConfig = storageConfig;
this.metadataService = metadataService;
this.progressService = progressService;
this.lessonService = lessonService;
}
@Async("transcodingExecutor")
public void transcodeVideo(Long courseId, Long lessonId, String rawVideoPath) {
Path inputPath = Path.of(storageConfig.getBasePath(), rawVideoPath);
Path outputDir = Path.of(
storageConfig.getHlsPath(),
String.valueOf(courseId),
String.valueOf(lessonId)
);
try {
log.info("Starting transcoding: course={} lesson={}", courseId, lessonId);
progressService.updateProgress(lessonId, "TRANSCODING", 0);
// 1. Extract metadata
VideoMetadata metadata = metadataService.extractMetadata(inputPath);
log.info("Video metadata: {}x{}, {}s, codec={}",
metadata.width(), metadata.height(),
metadata.durationSeconds(), metadata.codec());
// Update lesson duration
lessonService.updateDuration(lessonId, metadata.durationSeconds());
// 2. Determine output resolutions
List<TranscodeProfile> profiles = getProfiles(metadata.height());
// 3. Create output directories
Files.createDirectories(outputDir);
// 4. Transcode each resolution
for (int i = 0; i < profiles.size(); i++) {
TranscodeProfile profile = profiles.get(i);
int baseProgress = (i * 90) / profiles.size();
int nextProgress = ((i + 1) * 90) / profiles.size();
transcodeToHls(inputPath, outputDir, profile, lessonId,
metadata.durationSeconds(), baseProgress, nextProgress);
}
// 5. Generate master playlist
generateMasterPlaylist(outputDir, profiles);
// 6. Update lesson status
String hlsPath = String.format("hls/%d/%d/master.m3u8", courseId, lessonId);
lessonService.updateVideoStatus(lessonId, LessonStatus.READY, hlsPath);
progressService.updateProgress(lessonId, "COMPLETE", 100);
log.info("Transcoding complete: course={} lesson={}", courseId, lessonId);
} catch (Exception e) {
log.error("Transcoding failed: course={} lesson={}", courseId, lessonId, e);
lessonService.updateVideoStatus(lessonId, LessonStatus.FAILED, null);
progressService.updateProgress(lessonId, "FAILED", -1);
}
}
private List<TranscodeProfile> getProfiles(int sourceHeight) {
List<TranscodeProfile> profiles = new ArrayList<>();
// Always include 360p
profiles.add(new TranscodeProfile("360p", 640, 360, "800k", "96k"));
// Include 720p if source is at least 720p
if (sourceHeight >= 720) {
profiles.add(new TranscodeProfile("720p", 1280, 720, "2500k", "128k"));
}
return profiles;
}
private void transcodeToHls(Path input, Path outputDir, TranscodeProfile profile,
Long lessonId, int totalDuration,
int baseProgress, int nextProgress) throws Exception {
Path profileDir = outputDir.resolve(profile.name());
Files.createDirectories(profileDir);
Path playlistPath = profileDir.resolve("playlist.m3u8");
Path segmentPattern = profileDir.resolve("segment-%03d.ts");
List<String> command = List.of(
"ffmpeg",
"-i", input.toString(),
"-vf", String.format("scale=%d:%d:force_original_aspect_ratio=decrease,pad=%d:%d:(ow-iw)/2:(oh-ih)/2",
profile.width(), profile.height(),
profile.width(), profile.height()),
"-c:v", "libx264",
"-preset", "medium",
"-b:v", profile.videoBitrate(),
"-maxrate", profile.videoBitrate(),
"-bufsize", String.valueOf(parseBitrate(profile.videoBitrate()) * 2),
"-c:a", "aac",
"-b:a", profile.audioBitrate(),
"-ac", "2",
"-ar", "44100",
"-hls_time", "10",
"-hls_list_size", "0",
"-hls_segment_filename", segmentPattern.toString(),
"-hls_playlist_type", "vod",
"-f", "hls",
"-y",
"-progress", "pipe:1",
playlistPath.toString()
);
log.info("Transcoding to {}: {}", profile.name(), String.join(" ", command));
ProcessBuilder pb = new ProcessBuilder(command);
pb.redirectErrorStream(true);
Process process = pb.start();
// Parse FFmpeg progress output
try (BufferedReader reader = new BufferedReader(
new InputStreamReader(process.getInputStream()))) {
String line;
while ((line = reader.readLine()) != null) {
if (line.startsWith("out_time_us=")) {
long microseconds = Long.parseLong(line.split("=")[1]);
int currentSeconds = (int) (microseconds / 1_000_000);
if (totalDuration > 0) {
double ratio = (double) currentSeconds / totalDuration;
int progress = baseProgress +
(int) (ratio * (nextProgress - baseProgress));
progress = Math.min(progress, nextProgress);
progressService.updateProgress(lessonId, "TRANSCODING", progress);
}
}
}
}
int exitCode = process.waitFor();
if (exitCode != 0) {
throw new RuntimeException("FFmpeg failed for " + profile.name() +
" with exit code: " + exitCode);
}
log.info("Transcoding complete for {}: {}", profile.name(), playlistPath);
}
private void generateMasterPlaylist(Path outputDir, List<TranscodeProfile> profiles) throws Exception {
StringBuilder playlist = new StringBuilder();
playlist.append("#EXTM3U\n");
playlist.append("#EXT-X-VERSION:3\n\n");
for (TranscodeProfile profile : profiles) {
int bandwidth = parseBitrate(profile.videoBitrate()) +
parseBitrate(profile.audioBitrate());
playlist.append(String.format(
"#EXT-X-STREAM-INF:BANDWIDTH=%d,RESOLUTION=%dx%d,NAME=\"%s\"\n",
bandwidth, profile.width(), profile.height(), profile.name()
));
playlist.append(profile.name()).append("/playlist.m3u8\n\n");
}
Files.writeString(outputDir.resolve("master.m3u8"), playlist.toString());
}
private int parseBitrate(String bitrate) {
String value = bitrate.replaceAll("[^0-9]", "");
int num = Integer.parseInt(value);
if (bitrate.endsWith("k")) return num * 1000;
if (bitrate.endsWith("m")) return num * 1_000_000;
return num;
}
record TranscodeProfile(
String name,
int width,
int height,
String videoBitrate,
String audioBitrate
) {}
}

FFmpeg Command Breakdown
Let's break down the FFmpeg command — it's doing a lot:
| Flag | Purpose |
|---|---|
| `-i input.mp4` | Input file |
| `-vf scale=...` | Resize video, maintain aspect ratio, add padding if needed |
| `-c:v libx264` | Encode video with H.264 (universal browser support) |
| `-preset medium` | Balance between encoding speed and file size |
| `-b:v 2500k` | Target video bitrate (720p) |
| `-maxrate` / `-bufsize` | Constrain bitrate for consistent streaming |
| `-c:a aac -b:a 128k` | AAC audio at 128 kbps |
| `-hls_time 10` | Each segment is 10 seconds |
| `-hls_list_size 0` | Include all segments in the playlist (VOD) |
| `-hls_playlist_type vod` | Signal this is video-on-demand, not live |
| `-progress pipe:1` | Output progress to stdout for tracking |
| `-y` | Overwrite output files without asking |
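The bitrate strings ("800k", "2500k") get parsed into raw numbers — both for `-bufsize` and for the BANDWIDTH values in the master playlist, which are the video and audio bitrates summed. Here's a standalone sanity check of that arithmetic; the method mirrors the service's `parseBitrate`, but this sketch is self-contained:

```java
public class BitrateCheck {
    // Mirrors TranscodingService.parseBitrate: "800k" -> 800_000, "2m" -> 2_000_000
    static int parseBitrate(String bitrate) {
        int num = Integer.parseInt(bitrate.replaceAll("[^0-9]", ""));
        if (bitrate.endsWith("k")) return num * 1000;
        if (bitrate.endsWith("m")) return num * 1_000_000;
        return num;
    }

    public static void main(String[] args) {
        // 360p: 800k video + 96k audio -> BANDWIDTH=896000
        System.out.println(parseBitrate("800k") + parseBitrate("96k"));   // 896000
        // 720p: 2500k video + 128k audio -> BANDWIDTH=2628000
        System.out.println(parseBitrate("2500k") + parseBitrate("128k")); // 2628000
    }
}
```

Those two sums are exactly the BANDWIDTH values you'll see in the generated master.m3u8 later in this post.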
Resolution Selection Logic
We only transcode to resolutions at or below the source: a 720p or 1080p upload gets both 360p and 720p renditions, while a 480p upload gets 360p only.
Upscaling a 480p video to 720p wastes bandwidth without improving quality — so we skip it.
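The ladder selection boils down to a single height check. A minimal standalone sketch of the same logic as `getProfiles` (profile names only, bitrates omitted for brevity):

```java
import java.util.ArrayList;
import java.util.List;

public class LadderCheck {
    // Mirrors getProfiles(): always include 360p, add 720p only if the source can supply it
    static List<String> profilesFor(int sourceHeight) {
        List<String> profiles = new ArrayList<>();
        profiles.add("360p");                          // always included
        if (sourceHeight >= 720) profiles.add("720p"); // never upscale
        return profiles;
    }

    public static void main(String[] args) {
        System.out.println(profilesFor(480));  // [360p]
        System.out.println(profilesFor(1080)); // [360p, 720p]
    }
}
```

Adding a 1080p rung later is a one-line change: another `if (sourceHeight >= 1080)` guard.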
Async Processing with @Async
Transcoding takes minutes (or longer for big files). We can't block the upload request waiting for it to finish. Spring's @Async runs the transcoding in a background thread.
Thread Pool Configuration
// src/main/java/com/videoplatform/api/config/AsyncConfig.java
package com.videoplatform.api.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
import java.util.concurrent.Executor;
@Configuration
@EnableAsync
public class AsyncConfig {
@Bean(name = "transcodingExecutor")
public Executor transcodingExecutor() {
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
executor.setCorePoolSize(2); // 2 concurrent transcoding jobs
executor.setMaxPoolSize(4); // Burst up to 4
executor.setQueueCapacity(50); // Queue up to 50 pending jobs
executor.setThreadNamePrefix("transcode-");
executor.initialize();
return executor;
}
}

Why limit to 2–4 concurrent jobs? FFmpeg is CPU-intensive. On a typical VPS with 4 cores, running more than 2 transcoding jobs simultaneously makes all of them slower. Better to queue and process sequentially.
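If you're outside Spring, the same bounds can be expressed with plain java.util.concurrent — a sketch of the shape, not a drop-in replacement (Spring's ThreadPoolTaskExecutor wraps this and adds lifecycle management):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ExecutorSketch {
    public static void main(String[] args) {
        // Same shape as the transcodingExecutor bean: core 2, max 4, queue capacity 50
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                2, 4,                          // core / max pool size
                60, TimeUnit.SECONDS,          // idle-thread keep-alive
                new ArrayBlockingQueue<>(50)); // bounded queue of pending jobs

        System.out.println(executor.getCorePoolSize());    // 2
        System.out.println(executor.getMaximumPoolSize()); // 4
        executor.shutdown();
    }
}
```

One subtlety worth knowing: ThreadPoolExecutor (and therefore Spring's wrapper) only grows beyond the core size once the queue is full — so with a 50-deep queue, you'll run 2 concurrent jobs until 50 more are already waiting.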
Progress Tracking in Redis
Redis stores the real-time transcoding progress so any client (including the admin dashboard) can poll for status:
// src/main/java/com/videoplatform/api/service/TranscodingProgressService.java
package com.videoplatform.api.service;
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.stereotype.Service;
import java.time.Duration;
import java.util.HashMap;
import java.util.Map;
@Service
public class TranscodingProgressService {
private static final String KEY_PREFIX = "transcode:progress:";
private static final Duration TTL = Duration.ofHours(24);
private final StringRedisTemplate redisTemplate;
public TranscodingProgressService(StringRedisTemplate redisTemplate) {
this.redisTemplate = redisTemplate;
}
public void updateProgress(Long lessonId, String status, int percentage) {
String key = KEY_PREFIX + lessonId;
Map<String, String> data = new HashMap<>();
data.put("status", status);
data.put("percentage", String.valueOf(percentage));
data.put("updatedAt", String.valueOf(System.currentTimeMillis()));
redisTemplate.opsForHash().putAll(key, data);
redisTemplate.expire(key, TTL);
}
public TranscodeProgress getProgress(Long lessonId) {
String key = KEY_PREFIX + lessonId;
Map<Object, Object> data = redisTemplate.opsForHash().entries(key);
if (data.isEmpty()) {
return new TranscodeProgress("UNKNOWN", 0);
}
return new TranscodeProgress(
(String) data.getOrDefault("status", "UNKNOWN"),
Integer.parseInt((String) data.getOrDefault("percentage", "0"))
);
}
public void clearProgress(Long lessonId) {
redisTemplate.delete(KEY_PREFIX + lessonId);
}
public record TranscodeProgress(String status, int percentage) {}
}

Why Redis instead of a database column? Because progress updates happen every few seconds during transcoding. Writing to PostgreSQL that frequently would be wasteful. Redis handles it effortlessly, and the data is ephemeral — we don't need it once transcoding is done.
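For unit-testing code that depends on progress tracking, you can swap the Redis-backed service for an in-memory stand-in with the same read/write shape. This is a hypothetical sketch, not part of the codebase — it mimics the `transcode:progress:{lessonId}` hash with a map:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class InMemoryProgressStub {
    // lessonId -> [status, percentage], mimicking the Redis hash fields
    private final Map<Long, String[]> store = new ConcurrentHashMap<>();

    public void updateProgress(Long lessonId, String status, int percentage) {
        store.put(lessonId, new String[]{status, String.valueOf(percentage)});
    }

    public String[] getProgress(Long lessonId) {
        // Same fallback as the real service when no key exists yet
        return store.getOrDefault(lessonId, new String[]{"UNKNOWN", "0"});
    }

    public static void main(String[] args) {
        InMemoryProgressStub stub = new InMemoryProgressStub();
        System.out.println(stub.getProgress(1L)[0]); // UNKNOWN
        stub.updateProgress(1L, "TRANSCODING", 42);
        System.out.println(stub.getProgress(1L)[1]); // 42
    }
}
```

(No TTL here — in the real service the key expires after 24 hours, which a stub doesn't need.)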
Trigger Transcoding After Upload
Update the upload controller to kick off transcoding automatically:
// Update AdminUploadController.java — modify the uploadVideo method
@PostMapping("/video")
public ResponseEntity<ApiResponse<UploadResponse>> uploadVideo(
@RequestParam("file") MultipartFile file,
@RequestParam("courseId") Long courseId,
@RequestParam("lessonId") Long lessonId) {
// 1. Validate file
validationService.validateVideoFile(file);
// 2. Store on disk
String videoPath = storageService.storeRawVideo(file, courseId, lessonId);
// 3. Update lesson record
lessonService.updateVideoPath(lessonId, videoPath);
// 4. Trigger async transcoding
transcodingService.transcodeVideo(courseId, lessonId, videoPath);
// 5. Return response (transcoding happens in background)
UploadResponse response = new UploadResponse(
videoPath,
file.getSize(),
file.getOriginalFilename(),
file.getContentType()
);
return ResponseEntity.ok(
ApiResponse.success("Video uploaded. Transcoding started.", response)
);
}

The response returns immediately after the file is saved. Transcoding runs in the background — the admin can check progress via the status endpoint.
Add LessonService Methods
// Add to LessonService.java
@Transactional
public void updateDuration(Long lessonId, int durationSeconds) {
Lesson lesson = lessonRepository.findById(lessonId)
.orElseThrow(() -> new ResourceNotFoundException("Lesson not found: " + lessonId));
lesson.setDuration(durationSeconds);
lessonRepository.save(lesson);
// Update course total duration
courseService.refreshCourseStats(lesson.getSection().getCourse().getId());
}
@Transactional
public void updateVideoStatus(Long lessonId, LessonStatus status, String hlsPath) {
Lesson lesson = lessonRepository.findById(lessonId)
.orElseThrow(() -> new ResourceNotFoundException("Lesson not found: " + lessonId));
lesson.setStatus(status);
if (hlsPath != null) {
lesson.setVideoPath(hlsPath);
}
lessonRepository.save(lesson);
}

Transcoding Status API
Add an endpoint to check transcoding progress:
// Add to AdminUploadController.java
@GetMapping("/transcode-status/{lessonId}")
public ResponseEntity<ApiResponse<TranscodeStatusResponse>> getTranscodeStatus(
@PathVariable Long lessonId) {
var progress = progressService.getProgress(lessonId);
return ResponseEntity.ok(ApiResponse.success(
new TranscodeStatusResponse(
progress.status(),
progress.percentage()
)
));
}

// src/main/java/com/videoplatform/api/dto/response/TranscodeStatusResponse.java
package com.videoplatform.api.dto.response;
public record TranscodeStatusResponse(
String status,
int percentage
) {}

Admin UI: Transcoding Progress
Update the lesson row in the admin dashboard to show transcoding status:
// web/src/components/admin/TranscodeStatus.tsx
"use client";
import { useEffect, useState } from "react";
import { adminApi } from "@/lib/admin-api";
import { Progress } from "@/components/ui/progress";
import { Badge } from "@/components/ui/badge";
import { Loader2, CheckCircle2, XCircle, Clock } from "lucide-react";
interface TranscodeStatusProps {
lessonId: number;
lessonStatus: string;
}
export function TranscodeStatus({ lessonId, lessonStatus }: TranscodeStatusProps) {
const [progress, setProgress] = useState<{ status: string; percentage: number } | null>(null);
const [polling, setPolling] = useState(lessonStatus === "PROCESSING");
useEffect(() => {
if (!polling) return;
const interval = setInterval(async () => {
try {
const data = await adminApi.getTranscodeStatus(lessonId);
setProgress(data);
if (data.status === "COMPLETE" || data.status === "FAILED") {
setPolling(false);
}
} catch {
setPolling(false);
}
}, 2000); // Poll every 2 seconds
return () => clearInterval(interval);
}, [lessonId, polling]);
if (lessonStatus === "READY") {
return (
<Badge variant="outline" className="text-green-600">
<CheckCircle2 className="h-3 w-3 mr-1" />
Ready
</Badge>
);
}
if (lessonStatus === "FAILED") {
return (
<Badge variant="outline" className="text-red-600">
<XCircle className="h-3 w-3 mr-1" />
Failed
</Badge>
);
}
if (lessonStatus === "PROCESSING" && progress) {
return (
<div className="flex items-center gap-2 min-w-[150px]">
<Loader2 className="h-3 w-3 animate-spin text-primary" />
<Progress value={progress.percentage} className="h-1.5 flex-1" />
<span className="text-xs text-muted-foreground">{progress.percentage}%</span>
</div>
);
}
if (lessonStatus === "PROCESSING") {
return (
<Badge variant="outline" className="text-yellow-600">
<Clock className="h-3 w-3 mr-1" />
Queued
</Badge>
);
}
return null;
}

// Add to admin-api.ts
getTranscodeStatus: (lessonId: number) =>
api
.get<{ data: { status: string; percentage: number } }>(
`/api/admin/upload/transcode-status/${lessonId}`
)
.then((r) => r.data.data),

Complete Transcoding Pipeline
Here's the full flow from upload to playable HLS:
1. Admin uploads the raw MP4 — the file is stored on disk and the lesson record updated.
2. The controller fires transcodeVideo() on the transcodingExecutor pool and returns immediately.
3. ffprobe extracts duration, resolution, and codec; the lesson duration is saved.
4. FFmpeg transcodes to 360p (and 720p if the source allows), writing segments and per-quality playlists while pushing progress to Redis.
5. The master playlist is generated and the lesson is marked READY — the video is now streamable.
Master Playlist Example
Here's what the generated master.m3u8 looks like:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=896000,RESOLUTION=640x360,NAME="360p"
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2628000,RESOLUTION=1280x720,NAME="720p"
720p/playlist.m3u8

And a quality-specific playlist.m3u8:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:10.000000,
segment-000.ts
#EXTINF:10.000000,
segment-001.ts
#EXTINF:10.000000,
segment-002.ts
#EXTINF:7.520000,
segment-003.ts
#EXT-X-ENDLIST

Testing the Pipeline
1. Upload and Transcode
# Upload a video — transcoding starts automatically
curl -X POST http://localhost:8080/api/admin/upload/video \
-H "Authorization: Bearer ${TOKEN}" \
-F "file=@sample-video.mp4" \
-F "courseId=1" \
-F "lessonId=1"

2. Check Progress
# Poll transcoding status
watch -n 2 'curl -s http://localhost:8080/api/admin/upload/transcode-status/1 \
-H "Authorization: Bearer ${TOKEN}" | jq'

You'll see the progress increment:
{ "status": "TRANSCODING", "percentage": 23 }
{ "status": "TRANSCODING", "percentage": 47 }
{ "status": "TRANSCODING", "percentage": 71 }
{ "status": "TRANSCODING", "percentage": 95 }
{ "status": "COMPLETE", "percentage": 100 }

3. Verify HLS Output
# Check generated files
ls -la /data/videos/hls/1/1/
# master.m3u8 360p/ 720p/
cat /data/videos/hls/1/1/master.m3u8
ls /data/videos/hls/1/1/360p/
# playlist.m3u8 segment-000.ts segment-001.ts ...

4. Test Playback (Quick Check)
You can test HLS playback locally with ffplay:
ffplay /data/videos/hls/1/1/master.m3u8

Or serve it with a quick Python HTTP server and open it in a browser with an HLS player extension.
Common Mistakes
1. Running FFmpeg Synchronously
Never run FFmpeg in the request thread — it blocks the HTTP connection for minutes:
// WRONG — blocks the request
@PostMapping("/upload")
public ResponseEntity<?> upload(MultipartFile file) {
storeFile(file);
runFFmpeg(file); // Blocks for 5+ minutes!
return ResponseEntity.ok("Done");
}
// RIGHT — async background processing
@PostMapping("/upload")
public ResponseEntity<?> upload(MultipartFile file) {
storeFile(file);
transcodingService.transcodeVideo(...); // Returns immediately
return ResponseEntity.ok("Transcoding started");
}

2. Not Limiting Thread Pool Size
Unlimited FFmpeg processes will kill your server:
// WRONG — unlimited threads
@Async
public void transcode() { ... }
// RIGHT — bounded thread pool
@Async("transcodingExecutor") // Uses our configured pool with max 4 threads
public void transcode() { ... }

3. Ignoring FFmpeg Exit Codes
FFmpeg returns non-zero exit codes on failure. Always check:
int exitCode = process.waitFor();
if (exitCode != 0) {
throw new RuntimeException("FFmpeg failed: exit code " + exitCode);
}

4. Hardcoding Resolutions
If someone uploads a 480p video, don't upscale to 720p:
// WRONG — always transcode to both
profiles.add(new TranscodeProfile("360p", ...));
profiles.add(new TranscodeProfile("720p", ...));
// RIGHT — check source resolution first
if (sourceHeight >= 720) {
profiles.add(new TranscodeProfile("720p", ...));
}

What's Next?
HLS files are on disk, but there's no way to serve them to browsers yet. In Post #8, we'll set up secure video streaming:
- Nginx configuration for HLS file serving
- secure_link module for token-based URL validation
- Signed URL generation in Spring Boot
- CORS headers for cross-origin video playback
- Preventing direct download of video segments
Time to serve those HLS streams to the world — securely.
Series: Build a Video Streaming Platform
Previous: Phase 5: Video Upload & Storage
Next: Phase 7: Secure Video Streaming