Quick Search of Hidden Photos With Optimization Approaches
A dramatic performance improvement of my original photo search app.
Have you ever found yourself drowning in a sea of unorganized photos? I know I have!
The other day, while experimenting with my photo finder app to learn more about Generative AI, I successfully found the images I was looking for. Fantastic!
But then, an idea struck me: What happens when thousands of photos are in the folder?
It’s not far-fetched — think about how many photos your friends take of their fur babies daily. This realization pushed me to explore handling large image collections effectively without overwhelming system resources.
The Original Feature: A Simple Beginning
The app started with two key features:
Loading all images from a selected folder.
Searching photos by content: the selected images are sent to the backend, which filters them and returns the best matches.
Here’s the abbreviated code that powered this:
// Load all images from a selected folder
async handleFolderSelect() {
  const dirHandle = await window.showDirectoryPicker();
  const images = [];
  // Load all images at once
  for await (const entry of dirHandle.values()) {
    if (entry.kind === 'file' && entry.name.match(/\.(jpg|jpeg|png|gif)$/i)) {
      const file = await entry.getFile();
      const blobUrl = URL.createObjectURL(file);
      images.push({
        path: blobUrl,
        name: entry.name,
        handle: entry
      });
    }
  }
  this.images = images;
  this.allImages = images;
}
// Search images based on content
async searchImages() {
  // Create a FormData object to send all images
  const formData = new FormData();
  formData.append('query', this.searchQuery);
  // Add all current images to the request
  for (const image of this.images) {
    if (image.handle) {
      // Get the actual file from the handle
      const file = await image.handle.getFile();
      formData.append('images[]', file, image.name);
    }
  }
  const response = await fetch('/search', {
    method: 'POST',
    body: formData
  });
  // Update images with the search results
  const data = await response.json();
  this.images = data.results;
}
While this works perfectly for smaller folders, the cracks appear when dealing with thousands of photos.
Challenges with Large Folders
Handling a folder with 1,000+ images is no small feat. Here are the main challenges:
Performance Issues: Loading all images at once can slow everything down, from your browser to your server.
Memory Usage: Creating blob URLs for 1,000 images and sending them all to the server in a single request can overwhelm your system’s memory.
User Experience: Long wait times and an unresponsive app are frustrating. Users need feedback and interactivity, even with large data sets.
Clearly, a better approach was needed.
Batch Processing: A Smarter Way
The first step toward scalability is batch processing.
Batch processing helps in three ways:
Improved Memory Management: Smaller batches mean fewer resources are consumed at once.
Network Efficiency: Smaller HTTP requests are less likely to time out and are easier to retry if they fail.
Responsive Server: Processing smaller batches reduces the risk of out-of-memory (OOM) errors.
By processing images in smaller groups — say, 50 to 100 at a time — you can manage resources more effectively.
Here’s how I implemented it:
async searchImages() {
  const BATCH_SIZE = 50;
  let allResults = [];
  for (let i = 0; i < this.allImages.length; i += BATCH_SIZE) {
    const batch = this.allImages.slice(i, i + BATCH_SIZE);
    const formData = new FormData();
    formData.append('query', this.searchQuery);
    for (const image of batch) {
      const file = await image.handle.getFile();
      formData.append('images[]', file, image.name);
    }
    const response = await fetch('/search', { method: 'POST', body: formData });
    const data = await response.json();
    allResults = allResults.concat(data.results);
  }
  this.images = allResults.sort((a, b) => b.score - a.score);
}
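One benefit mentioned above is that smaller requests are easier to retry when they fail, but the batch loop doesn't actually retry anything. A minimal sketch of a generic retry helper with exponential backoff (the name `withRetry` and its parameters are my own, not part of the original app):

```javascript
// Hypothetical retry helper: runs an async operation, retrying with
// exponential backoff before giving up and rethrowing the last error.
async function withRetry(operation, maxRetries = 3, baseDelayMs = 200) {
  let lastError;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await operation();
    } catch (err) {
      lastError = err;
      if (attempt < maxRetries) {
        // Back off 200ms, 400ms, 800ms, ... between attempts
        await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

The fetch call in the batch loop could then be wrapped as `await withRetry(() => fetch('/search', { method: 'POST', body: formData }))`, so a single flaky batch no longer fails the whole search.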
Batch processing provides much-needed control over memory usage and server load, but it doesn’t resolve the user-experience problem: results still only appear after every batch has finished. What about giving the user immediate feedback?
Progressive Loading: Immediate Feedback
For a more interactive experience, I turned to progressive loading. This approach processes and displays images in smaller chunks (e.g., 10 at a time), providing immediate feedback to the user.
Benefits of Progressive Loading
Better User Experience: Results appear incrementally, keeping users engaged.
Lower Memory Usage: Only a small number of images are loaded at any given time.
Responsive Interface: Updates happen dynamically, avoiding long delays.
Here’s how it works:
async searchImages() {
  const CHUNK_SIZE = 10;
  this.isLoading = true;
  this.images = [];
  try {
    for (let i = 0; i < this.allImages.length; i += CHUNK_SIZE) {
      const chunk = this.allImages.slice(i, i + CHUNK_SIZE);
      const results = await this.processImageChunk(chunk);
      this.images = [...this.images, ...results].sort((a, b) => b.score - a.score);
      await new Promise(resolve => setTimeout(resolve, 0)); // Let the UI update
    }
  } finally {
    this.isLoading = false;
  }
}
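The `processImageChunk` helper isn't shown in the article. A plausible sketch, assuming it posts to the same `/search` endpoint with the same `{ results: [...] }` response shape as the batch version (the `fetchImpl` parameter is my addition so the network call can be swapped out):

```javascript
// Sketch of the processImageChunk helper: sends one small chunk of images
// to the backend and returns that chunk's scored results.
async function processImageChunk(chunk, query, fetchImpl = fetch) {
  const formData = new FormData();
  formData.append('query', query);
  for (const image of chunk) {
    // Read each file lazily from its file-system handle
    const file = await image.handle.getFile();
    formData.append('images[]', file, image.name);
  }
  const response = await fetchImpl('/search', { method: 'POST', body: formData });
  const data = await response.json();
  return data.results; // e.g. [{ name, score }]
}
```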
This method strikes a great balance between performance and user satisfaction, especially for large collections.
Progressive loading was a significant improvement, but it revealed new challenges. While it successfully delivers search results incrementally, several critical questions emerged:
How would the app perform when the search results grow into the thousands?
Can we maintain smooth UI interactions with such a large dataset?
What’s the most efficient way to render thousands of images without overwhelming the browser?
Frontend Display: Infinite Scrolling
When it comes to displaying large image collections in a web interface, lazy loading combined with infinite scrolling offers an elegant solution. By loading images only when they’re about to appear in the viewport, you save memory and improve performance.
Here’s how I used Intersection Observer for lazy loading:
mounted() {
  this.observer = new IntersectionObserver(
    (entries) => {
      entries.forEach(async entry => {
        if (entry.isIntersecting) {
          const image = entry.target.__vue_data__;
          if (image && !image.path) {
            const file = await image.handle.getFile();
            image.path = URL.createObjectURL(file);
          }
        }
      });
    },
    { root: null, rootMargin: '50px', threshold: 0.1 }
  );
}
beforeUnmount() {
  // Remove all observers
  this.unobserveImages();
  // Clean up blob URLs
  this.clearBlobUrls();
  // Disconnect the observer
  this.observer.disconnect();
}
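The cleanup helpers called in `beforeUnmount` aren't shown in the article. Plausible standalone sketches (the exact signatures are my assumption; in the app these would be component methods operating on `this.allImages` and `this.observer`):

```javascript
// Revoke every blob URL we created, so the browser can release the memory.
function clearBlobUrls(images) {
  for (const image of images) {
    if (image.path) {
      URL.revokeObjectURL(image.path);
      image.path = null; // recreated on demand when the image scrolls back in
    }
  }
}

// Detach the IntersectionObserver from every tracked element.
function unobserveImages(observer, elements) {
  for (const el of elements) {
    observer.unobserve(el);
  }
}
```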
Combining HTML and JavaScript, this implementation:
Only loads images as they become visible in the viewport
Uses placeholders for unloaded images
Preloads images slightly before they become visible
Properly handles component cleanup
With this setup, images load just in time as the user scrolls through the gallery, ensuring a smooth browsing experience.
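One piece of wiring isn't shown: how each rendered element gets the `__vue_data__` back-reference that the observer callback reads. A hedged sketch of that `observeImages` step as a standalone function (the signature is my assumption; in the app it would be a component method run after `this.$nextTick()`):

```javascript
// Link each rendered element to its image entry and start observing it,
// so the IntersectionObserver callback can load the blob URL on demand.
function observeImages(elements, images, observer) {
  elements.forEach((el, idx) => {
    el.__vue_data__ = images[idx]; // what the observer callback reads
    observer.observe(el);
  });
}
```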
Choosing the Right Approaches
Here is a summary of the use cases each approach suits best:
Batch Processing is ideal when:
Search accuracy is the priority
You’re working with smaller collections (<500 images)
You need precise sorting of results
Server-side processing is more important
Progressive Loading is best suited when:
You’re dealing with very large collections (1,000+ images)
You want immediate feedback
You’re building a browsing-focused interface
Infinite Scrolling excels at:
Handling lazy loading automatically as users scroll
Ensuring smooth performance
Rest assured, there is no single right or wrong choice among these strategies for handling large file collections. The approaches aren’t mutually exclusive; they can be combined to leverage their respective strengths.
Frontend Optimization Benefits
A strategic frontend-first approach offers significant advantages over bulk loading everything at the backend. By restricting the data being sent to the server, it:
Reduces network traffic
Lowers server load
Improves memory usage on both ends
Speeds up initial loading
Enhances scalability
This optimization is particularly crucial for image-heavy applications, where file sizes can significantly impact performance.
The Final Outcome
The final optimized version looks like this:
async handleFolderSelect() {
  try {
    const dirHandle = await window.showDirectoryPicker();
    this.isLoading = true;
    this.currentFolder = dirHandle.name;
    this.allImages = [];
    this.images = [];
    // Clear existing blob URLs and remove image caches
    this.clearBlobUrls();
    this.unobserveImages();
    // Process files progressively
    for await (const entry of dirHandle.values()) {
      if (entry.kind === 'file' && entry.name.match(/\.(jpg|jpeg|png|gif)$/i)) {
        // Store handle first without creating blob URL
        const imageData = {
          name: entry.name,
          handle: entry,
          path: null // Will be created on demand
        };
        this.allImages.push(imageData);
      }
    }
    // Load first batch of images
    const initialBatch = this.allImages.slice(0, 50);
    for (const imageData of initialBatch) {
      const file = await imageData.handle.getFile();
      imageData.path = URL.createObjectURL(file);
    }
    // Update display
    this.images = this.allImages;
    // Wait for DOM update
    await this.$nextTick();
    // Start observing images
    this.observeImages();
    // Store the directory handle for future access
    this.$root.dirHandle = dirHandle;
  } catch (error) {
    console.error('Error:', error);
    alert('Error accessing folder: ' + error.message);
  } finally {
    this.isLoading = false;
  }
},
async searchImages() {
  const BATCH_SIZE = 50;
  this.isLoading = true;
  try {
    let allResults = [];
    const totalBatches = Math.ceil(this.allImages.length / BATCH_SIZE);
    // Process images in batches
    for (let i = 0; i < this.allImages.length; i += BATCH_SIZE) {
      const currentBatch = Math.floor(i / BATCH_SIZE) + 1;
      this.progress = `Processing batch ${currentBatch}/${totalBatches}`;
      const batch = this.allImages.slice(i, i + BATCH_SIZE);
      const formData = new FormData();
      formData.append('query', this.searchQuery);
      for (const image of batch) {
        if (image.handle) {
          const file = await image.handle.getFile();
          formData.append('images[]', file, image.name);
        }
      }
      const batchResults = await fetch('/search', {
        method: 'POST',
        body: formData
      }).then(r => r.json());
      const curResults = batchResults.results.map(result => {
        const existingImage = this.allImages.find(img => img.name === result.name);
        const path = existingImage
          ? existingImage.path
          : URL.createObjectURL(new Blob([result.image_data], { type: 'image/jpeg' }));
        return {
          path: path,
          name: result.name,
          score: result.score,
          handle: existingImage ? existingImage.handle : null
        };
      });
      // Show intermediate results while processing continues
      allResults = [...allResults, ...curResults];
      this.images = allResults.sort((a, b) => b.score - a.score);
    }
  } catch (error) {
    console.error('Search error:', error);
    alert(error.message);
  } finally {
    this.isLoading = false;
  }
},
Final Thoughts
This journey into building an image search app taught me a ton!
The biggest takeaway is about finding that sweet spot between snappy performance, smart memory use, and making things feel smooth for users. Whether you go with batch processing or progressive loading, there’s always room to make things better.
If you’re building something similar, think about how apps like Facebook or Instagram handle massive data sets. What can you learn from their approaches?
This project was a step forward in managing large image collections, and I’m excited to continue refining it. Got questions or ideas? Let me know!