Fetch chunk for matches
Created: January 17, 2025
Below is one possible way to modify `SearchedItem` so that:

- it no longer holds the entire file content in memory, and
- it can navigate among a file's matches (`otherSearchResults`) by clicking on the left and right arrows.

The main idea is:

- The child requests each chunk through a callback (`onFetchChunk`) from the parent (where you do have access to the file ID and can retrieve partial content on demand).
- The child keeps local state for `currentMatchIndex` (which match is being displayed), its `chunk` of text, and the `score`.

Wherever you render the list of results (e.g. in your `renderItem` for the `FlatList`), add something like `onFetchChunk` to each item.
For example, if you already have a method that can retrieve partial file content based on `(fileId, firstCharIndex)`, you could do something like:
```js
function getPartialFileContent(fileId, start, length) {
  // You can implement a method in localFileManager or gdriveFileManager
  // that only reads a small chunk of the file, not the entire content.
  // For example:
  //   return localFileManager.getFileContentRange(fileId, start, length);
  // Or, in a simpler approach, if your AISearchByFileContent already stores
  // the entire content in memory, you could slice it. But to truly save memory,
  // you want a partial-read approach. This is just an example stub:
  return localFileManager.getFileContentPartial(fileId, start, length);
}
```
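For the "simpler approach" mentioned in the stub's comments (slicing content that is already in memory), a minimal sketch of the slicing helper could look like this. `sliceChunk` is a hypothetical name, not part of your existing managers; the point is to clamp the window so out-of-range match positions never throw:

```javascript
// Sketch: bound how much text each result holds by slicing a fixed window.
// Clamping keeps the function safe even if a match index points past the end.
function sliceChunk(content, start, length) {
  const safeStart = Math.max(0, Math.min(start, content.length));
  const safeEnd = Math.min(safeStart + length, content.length);
  return content.substring(safeStart, safeEnd);
}
```

This does not save memory by itself (the full content is still loaded once), but it keeps each `SearchedItem` from retaining the whole file.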
Then, create a callback function that knows how to fetch the chunk for a given match index:
```js
async function onFetchChunk(item, matchIndex) {
  const match = item.otherSearchResults[matchIndex];
  if (!match) return null;

  const startPos = match.firstCharIndex;
  const length = 1500; // or your MAX_CHUNK_LENGTH

  // Fetch partial content from local storage or GDrive based on item.fileType.
  let chunk = '';
  try {
    chunk = await getPartialFileContent(item.fileId, startPos, length);
  } catch (err) {
    console.error('Error fetching partial file content:', err);
    chunk = t('Failed to fetch this part of the file.');
  }

  return { text: chunk, score: match.score, matchIndex };
}
```

(Note that `getPartialFileContent` takes a length, not an end position, so we pass the chunk length rather than `startPos + 1500`.)
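As an optional refinement (not part of the code above), you could position the window so the match sits near the middle of the chunk instead of at its start, which gives the user context on both sides. `chunkWindowFor` is a hypothetical helper sketched under that assumption:

```javascript
// Sketch: choose a window start so the match is roughly centered in the chunk.
// firstCharIndex: where the match begins; matchLength: its length in chars;
// chunkLength: the window size (e.g. MAX_CHUNK_LENGTH).
function chunkWindowFor(firstCharIndex, matchLength, chunkLength) {
  const padding = Math.floor((chunkLength - matchLength) / 2);
  const start = Math.max(0, firstCharIndex - padding);
  return { start, length: chunkLength };
}
```

If you adopt this, remember that the highlight offsets inside the chunk shift by `firstCharIndex - start`.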
Finally, pass that function down via the item:
```jsx
const renderItem = ({ item, index }) => {
  return (
    <SearchedItem
      index={index}
      item={item}
      // existing props
      onLongPress={() => selectFile(index)}
      onPress={() => handlePress(item, index)}
      headStyle={[item.isError && multiStyles.errorItemHead, item.isSelected && multiStyles.selectItemHead]}
      bodyStyle={[item.isError && multiStyles.errorItemBody, item.isSelected && multiStyles.selectItemBody]}
      bottomStyle={[item.isError && multiStyles.errorItemBottom, item.isSelected && multiStyles.selectItemBottom]}
      // NEW PROP for partial content fetch
      onFetchChunk={onFetchChunk}
    />
  );
};
```
SearchedItem.js
Below is a class-based version of `SearchedItem` that handles the logic of fetching new chunks when the user taps the left or right arrow. (Compare with your existing file and adjust as needed.)
```jsx
// SearchedItem.js
import React, { Component } from 'react';
import {
  ScrollView,
  Text,
  View,
  TouchableOpacity,
  ActivityIndicator,
  StyleSheet,
} from 'react-native';
import { EditorIcons } from './Icons';
import multiStyles from '../../styles/multifileSearchStyle';
import mainStyles from '../../styles/mainStyles';
import { isProApp } from '../appSwitcher';
import {
  HOME_FOLDER_NAME,
  FOLDER_NAME_SEPARATOR,
  GOOGLE_DRIVE,
  SHARED_FOLDER_NAME,
  IMPORTED_FILES_LABEL,
} from '../constantsAndFunctions';
import { FileType } from '../AppFile';
import { adjustSize } from '../../styles/adjustFontSize';
import { t } from '../../language/i18n';

export class SearchedItem extends Component {
  constructor(props) {
    super(props);
    this.scrollV = React.createRef();
    // If the parent already provides a small chunk in `item.text` for the
    // "best match", initialize with that. Otherwise, empty string.
    this.state = {
      currentMatchIndex: 0,
      chunk: props.item.text || '',
      score: props.item.score || 0,
      isLoadingChunk: false,
    };
  }

  /**
   * If we have multiple matches, load the chunk
   * for the currently selected match index.
   */
  async componentDidMount() {
    const { item } = this.props;
    // If we have `otherSearchResults` and the initial chunk is not provided,
    // or if you want to ensure we always fetch from the parent:
    if (item.otherSearchResults && item.otherSearchResults.length > 0 && !item.text) {
      this.loadMatchChunk(0);
    }
  }

  /**
   * Decides whether to update or not.
   * We compare old vs new props, plus local state changes.
   */
  shouldComponentUpdate(nextProps, nextState) {
    const item = this.props.item;
    const nextItem = nextProps.item;

    // Compare relevant props
    if (item.fileId !== nextItem.fileId) return true;
    if (item.text !== nextItem.text) return true;
    if (item.isAiSearchable !== nextItem.isAiSearchable) return true;
    if (item.isLoading !== nextItem.isLoading) return true;
    if (item.isOpening !== nextItem.isOpening) return true;
    if (item.pathQuery !== nextItem.pathQuery) return true;
    if (item.isSelected !== nextItem.isSelected) return true;
    if (item.isError !== nextItem.isError) return true;
    if (this.props.index !== nextProps.index) return true;

    // Compare local state
    if (this.state.currentMatchIndex !== nextState.currentMatchIndex) return true;
    if (this.state.chunk !== nextState.chunk) return true;
    if (this.state.score !== nextState.score) return true;
    if (this.state.isLoadingChunk !== nextState.isLoadingChunk) return true;

    return false;
  }

  /**
   * Helper that calls the parent's `onFetchChunk(item, newMatchIndex)`
   * to get the chunk text and score, then updates local state.
   */
  loadMatchChunk = async (newIndex) => {
    const { item, onFetchChunk } = this.props;
    if (!onFetchChunk) return;

    this.setState({ isLoadingChunk: true });
    try {
      const data = await onFetchChunk(item, newIndex);
      // data should be { text, score, matchIndex }
      if (data) {
        this.setState({
          currentMatchIndex: data.matchIndex,
          chunk: data.text,
          score: data.score,
          isLoadingChunk: false,
        });
      } else {
        this.setState({ isLoadingChunk: false });
      }
    } catch (err) {
      console.error('Error loading chunk:', err);
      this.setState({
        chunk: t('Error loading match content'),
        score: 0,
        isLoadingChunk: false,
      });
    }
  };

  handlePrevMatch = () => {
    const { currentMatchIndex } = this.state;
    if (currentMatchIndex > 0) {
      this.loadMatchChunk(currentMatchIndex - 1);
    }
  };

  handleNextMatch = () => {
    const { item } = this.props;
    const { currentMatchIndex } = this.state;
    if (!item.otherSearchResults) return;
    if (currentMatchIndex < item.otherSearchResults.length - 1) {
      this.loadMatchChunk(currentMatchIndex + 1);
    }
  };

  /**
   * Renders a small AI badge if isAiSearchable === 1 (in the Pro version).
   */
  showAiSearchable = () => {
    if (!isProApp) return null;
    if (this.props.item.isAiSearchable === 1) {
      return (
        <Text style={{ paddingHorizontal: adjustSize(5), fontSize: adjustSize(14), color: 'black' }}>
          Ai
        </Text>
      );
    }
    return null;
  };

  render() {
    const item = this.props.item;
    const pathQuery = item.pathQuery;
    const { chunk, score, currentMatchIndex, isLoadingChunk } = this.state;

    // Build the display path for the fileId
    let fileIdToShow = item.fileId;
    if (item.fileType === FileType.HOME_FILE) {
      fileIdToShow = t(HOME_FOLDER_NAME) + '/' + item.relativeFilePath;
    } else if (item.fileType === FileType.G_DRIVE_FILE) {
      fileIdToShow =
        t(GOOGLE_DRIVE) +
        '/' +
        item.relativeFilePath.replace(new RegExp(FOLDER_NAME_SEPARATOR, 'g'), '/');
    } else if (item.fileType === FileType.SHARED_FILE) {
      fileIdToShow = t(SHARED_FOLDER_NAME) + '/' + item.relativeFilePath;
    } else if (item.fileType === FileType.NEW_FILE) {
      fileIdToShow = '* ' + item.relativeFilePath;
    } else if (item.fileType === FileType.IMPORTED_FILE) {
      fileIdToShow = '* ' + t(IMPORTED_FILES_LABEL) + '/' + item.relativeFilePath;
    }

    // Highlight the path query if present
    let queryMatch = -1;
    let pathPrefix = fileIdToShow;
    let pathMatch = '';
    let pathSuffix = '';
    if (pathQuery) {
      queryMatch = fileIdToShow.toLowerCase().lastIndexOf(pathQuery.toLowerCase());
      if (queryMatch !== -1) {
        pathPrefix = fileIdToShow.substring(0, queryMatch);
        pathMatch = fileIdToShow.substring(queryMatch, queryMatch + pathQuery.length);
        pathSuffix = fileIdToShow.substring(queryMatch + pathQuery.length);
      }
    }

    // We keep item.queryIndex for highlighting inside the chunk if needed
    const queryIndex =
      item.queryIndex === undefined ? { start: 0, end: 0 } : item.queryIndex;

    return (
      <View style={multiStyles.searchedItem}>
        <View
          style={[
            multiStyles.searchedItemBorder,
            item.isError && { borderColor: '#FF737D' },
            item.isSelected && { borderColor: '#99D1FF' },
          ]}
        >
          {/* -- HEAD (filename / path row) -- */}
          <View
            style={[
              multiStyles.normalItemHead,
              this.props.headStyle,
              item.isLoading && { paddingRight: 0 },
            ]}
          >
            <ScrollView
              horizontal={true}
              showsHorizontalScrollIndicator={false}
              ref={this.scrollV}
              onContentSizeChange={() => {
                this.scrollV.current?.scrollToEnd();
              }}
            >
              <View>
                {queryMatch !== -1 ? (
                  <Text style={mainStyles.showPathText}>
                    <Text>{pathPrefix}</Text>
                    <Text style={mainStyles.match}>{pathMatch}</Text>
                    <Text>{pathSuffix}</Text>
                  </Text>
                ) : (
                  <Text style={mainStyles.showPathText}>{fileIdToShow} </Text>
                )}
              </View>
            </ScrollView>
            {item.isLoading && (
              <View
                style={{
                  paddingRight: adjustSize(7),
                  flexDirection: 'row',
                  justifyContent: 'center',
                  alignItems: 'center',
                }}
              >
                <ActivityIndicator size={adjustSize(14)} color="#000" />
              </View>
            )}
            {item.isLoading && this.scrollV.current?.scrollToEnd()}
            {this.showAiSearchable()}
          </View>

          {/* -- BODY (the chunk of text) -- */}
          <TouchableOpacity
            onLongPress={this.props.onLongPress}
            delayLongPress={1000}
            onPress={this.props.onPress}
          >
            <View
              style={[
                multiStyles.normalItemBodyView,
                !item.email && { borderBottomLeftRadius: adjustSize(20) },
                this.props.bodyStyle,
              ]}
            >
              {isLoadingChunk ? (
                <View style={{ padding: adjustSize(10) }}>
                  <ActivityIndicator size={adjustSize(24)} color="#888" />
                </View>
              ) : (
                <Text style={multiStyles.normalItemBodyText}>
                  <Text>{chunk.substring(0, queryIndex.start)}</Text>
                  <Text style={mainStyles.match}>
                    {chunk.substring(queryIndex.start, queryIndex.end)}
                  </Text>
                  <Text>{chunk.substring(queryIndex.end)}</Text>
                </Text>
              )}
            </View>
            {item.email && (
              <View style={[multiStyles.normalItemBottom, this.props.bottomStyle]}>
                <Text style={multiStyles.normalItemBottomText}>{item.email}</Text>
              </View>
            )}
            {item.isOpening && (
              <View
                style={{
                  width: '100%',
                  height: '100%',
                  position: 'absolute',
                  justifyContent: 'center',
                  backgroundColor: 'rgba(255, 255, 255, 0.5)',
                }}
              >
                <ActivityIndicator size={adjustSize(32)} color="#888" />
              </View>
            )}
          </TouchableOpacity>
        </View>

        {/* -- FOOTER (score and navigation for multiple matches) -- */}
        {typeof score === 'number' && (
          <View style={styles.scoreContainer}>
            <View style={styles.textContainer}>
              <Text style={styles.text}>
                {t('Match: ') + (score * 100).toFixed(2) + '%'}
              </Text>
            </View>
            {item.otherSearchResults && item.otherSearchResults.length > 1 && (
              <View style={styles.navigationContainer}>
                <TouchableOpacity
                  style={[styles.button, styles.button1]}
                  onPress={this.handlePrevMatch}
                  disabled={currentMatchIndex <= 0}
                >
                  <EditorIcons name="left_arrow_0_padding" size={22} color="#333" />
                </TouchableOpacity>
                <View style={styles.textContainer}>
                  <Text style={styles.text}>
                    {currentMatchIndex + 1}/{item.otherSearchResults.length}
                  </Text>
                </View>
                <TouchableOpacity
                  style={[styles.button, styles.button2]}
                  onPress={this.handleNextMatch}
                  disabled={currentMatchIndex >= item.otherSearchResults.length - 1}
                >
                  <EditorIcons name="right_arrow_0_padding" size={22} color="#333" />
                </TouchableOpacity>
              </View>
            )}
          </View>
        )}
      </View>
    );
  }
}

const styles = StyleSheet.create({
  scoreContainer: {
    flexDirection: 'row',
    paddingTop: 3,
    alignItems: 'flex-start',
    justifyContent: 'space-between',
    width: '100%',
  },
  textContainer: {
    height: adjustSize(22),
    flexDirection: 'row',
    alignItems: 'center',
  },
  text: {
    color: '#000',
  },
  navigationContainer: {
    flexDirection: 'row',
    alignItems: 'flex-start',
    justifyContent: 'center',
  },
  button: {
    height: adjustSize(48),
    width: adjustSize(48),
    minHeight: 48,
    minWidth: 48,
    flexDirection: 'row',
    alignItems: 'center',
  },
  button1: { justifyContent: 'flex-start' },
  button2: { justifyContent: 'flex-end' },
});
```
**`onFetchChunk` prop**
We pass a new callback prop (`onFetchChunk`) so that the child `SearchedItem` does not need the entire file content. It only asks the parent for the chunk it needs for the given `matchIndex`. `onFetchChunk(item, matchIndex)` is expected to return an object like `{ text, score, matchIndex }`.

**Local state**
`SearchedItem` keeps its own `currentMatchIndex`, `chunk`, and `score` in state. The parent may already supply a small excerpt for the best match in `props.item.text`, so we initialize `this.state.chunk` with that.

**Arrows to navigate**
`handlePrevMatch()` and `handleNextMatch()` simply call `loadMatchChunk()` with the new index. That function calls `onFetchChunk` from the parent and sets the chunk in local state.

**Avoid passing the entire file content**
The parent no longer passes `item.content`. It only passes the small excerpt for the best match in `item.text`, plus a callback `onFetchChunk` to fetch any other match on demand.

In your parent's code (e.g., the screen or component that uses `AISearchByFileContent` results), you will have something like this:
```js
async function onFetchChunk(item, matchIndex) {
  const match = item.otherSearchResults[matchIndex];
  if (!match) return null;

  const startPos = match.firstCharIndex;
  const length = 2000; // or your chunk size (MAX_CHUNK_LENGTH)

  let chunk;
  try {
    // Use whichever fileManager is appropriate:
    if (item.fileType === FileType.G_DRIVE_FILE) {
      chunk = await gdriveFileManager.getPartialContent(item.fileId, startPos, length);
    } else {
      chunk = await localFileManager.getFileContentPartial(item.fileId, startPos, length);
    }
  } catch (err) {
    console.error('Error fetching partial content:', err);
    return { text: t('Error loading match content'), score: 0, matchIndex };
  }

  return {
    text: chunk,
    score: match.score, // or compute something else
    matchIndex,
  };
}
```
Then in your `renderItem`:
```jsx
const renderItem = ({ item, index }) => {
  return (
    <SearchedItem
      index={index}
      item={item}
      onLongPress={() => selectFile(index)}
      onPress={() => handlePress(item, index)}
      headStyle={[item.isError && multiStyles.errorItemHead, item.isSelected && multiStyles.selectItemHead]}
      bodyStyle={[item.isError && multiStyles.errorItemBody, item.isSelected && multiStyles.selectItemBody]}
      bottomStyle={[item.isError && multiStyles.errorItemBottom, item.isSelected && multiStyles.selectItemBottom]}
      onFetchChunk={onFetchChunk} // <--- The new callback
    />
  );
};
```
This ensures the `SearchedItem` only has the current chunk in memory. As soon as the user taps the left or right arrow, it asks the parent to supply the relevant piece of text. You are no longer passing or storing the entire file content for each result.
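One trade-off of fetching on demand is that tapping back and forth between two matches refetches the same chunks. A minimal sketch of a fix, assuming the `onFetchChunk(item, matchIndex)` contract above, is to wrap the loader in a small cache (`withChunkCache` is a hypothetical helper, not part of the component):

```javascript
// Sketch: wrap an onFetchChunk-style loader so already-loaded chunks are
// served from memory. A Map preserves insertion order, so evicting the first
// key drops the oldest entry once the cache is full.
function withChunkCache(fetchChunk, maxEntries = 20) {
  const cache = new Map(); // "fileId:matchIndex" -> { text, score, matchIndex }
  return async (item, matchIndex) => {
    const key = `${item.fileId}:${matchIndex}`;
    if (cache.has(key)) return cache.get(key);
    const data = await fetchChunk(item, matchIndex);
    if (data) {
      cache.set(key, data);
      if (cache.size > maxEntries) {
        cache.delete(cache.keys().next().value);
      }
    }
    return data;
  };
}
```

You would then pass `withChunkCache(onFetchChunk)` instead of `onFetchChunk` to each item. Keep `maxEntries` small so the cache itself does not defeat the memory savings.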
If you prefer a function component instead of the class-based `SearchedItem`, you can replicate the same logic using React hooks (`useState`, `useEffect`, etc.). You may also want to add extra safety checks (e.g., in case `matchIndex` goes out of bounds).

With these changes, clicking the left or right arrows will correctly update the snippet shown (the `chunk`), display the new score, and show the correct "match X of N" index.