diff --git a/specs/task-android-timeline-freeze.spec.md b/specs/task-android-timeline-freeze.spec.md
new file mode 100644
index 00000000..85f535b0
--- /dev/null
+++ b/specs/task-android-timeline-freeze.spec.md
@@ -0,0 +1,145 @@
+spec: task
+name: "Android Timeline Freeze — Bound UI Work And Stop Full Timeline Snapshots"
+inherits: project
+tags: [bugfix, android, timeline, performance, matrix]
+estimate: 1.5d
+---
+
+## Intent
+
+Prevent the Robrix timeline screen on Android from freezing or approaching an ANR after running for a while. The current risk lies in Robrix itself: the background timeline subscription sends updates to the UI over an unbounded channel, `RoomScreen::process_timeline_updates` drains that queue without limit within a single `Event::Signal`, and the timeline subscriber clones the entire timeline snapshot on every incremental timeline update. The goal is to adopt a diff-driven timeline approach similar to Element X, so that the UI thread consumes only bounded batches and localized changes, preventing small updates from being amplified into full copies, full scans, and long main-thread stalls.
+
+## Constraints
+
+- This task is limited to fixing the timeline update hot path in Robrix itself; without an ANR trace or platform call-stack evidence, do not modify the Makepad platform layer
+- The amount of timeline-update work the UI thread performs per `Event::Signal` must have a hard upper bound; it must not `try_recv` unboundedly until the queue is empty
+- The background timeline subscriber must not send a full `timeline_items.clone()` snapshot in every `TimelineUpdate::ItemsChanged`
+- `TimelineUiState.items` on the UI side remains the authoritative UI data source for the current room's timeline; the background sends only diffs/metadata, and the UI updates its local items by applying those diffs
+- Existing scroll-position semantics must be preserved: stay in tail mode at the bottom on append, show the jump-to-bottom button when not at the bottom, and keep the currently visible item anchored where possible on front-insertion/clear_cache
+- Do not add new cargo dependencies
+- Do not run `cargo fmt`
+
+## Decisions
+
+- Add `MAX_TIMELINE_UPDATES_PER_SIGNAL` in `src/home/room_screen.rs` and use `should_yield_timeline_update_batch(processed_updates)` to cap timeline updates per `Event::Signal`
+- When `process_timeline_updates` reaches the per-signal update budget and the receiver still has pending updates, call `SignalToUI::set_ui_signal()` so remaining work is resumed in a later UI event
+- Replace full-snapshot incremental updates with `TimelineUpdate::ItemsChanged { item_diffs, changed_indices, is_append, clear_cache }`
+- Introduce a Robrix-owned diff enum for UI application, covering the Matrix SDK diff operations currently handled in `timeline_subscriber_handler`: Append, PushFront, PushBack,
PopFront, PopBack, Insert, Set, Remove, Truncate, Reset, Clear +- Keep `TimelineUpdate::FirstUpdate { initial_items }` as the initial snapshot path; only incremental timeline updates after the first update must avoid full snapshot cloning +- Apply item diffs on the UI thread to `tl.items` before running scroll-position adjustment, cache invalidation, streaming detection, bot discovery, and redraw decisions +- Limit bot discovery for incremental updates to `changed_indices` unless `clear_cache == true`, using a helper such as `timeline_update_scan_range(clear_cache, changed_indices, new_len)` +- Preserve `SignalToUI::set_ui_signal()` calls from async background tasks after enqueueing timeline updates + +## Boundaries + +### Allowed Changes +- `src/home/room_screen.rs` +- `src/sliding_sync.rs` +- `docs/plans/2026-05-07-android-timeline-freeze.md` +- `specs/task-android-timeline-freeze.spec.md` + +### Forbidden +- Do NOT modify Makepad platform crates or Android platform event-loop code in this task +- Do NOT change Matrix SDK upstream APIs or fork behavior +- Do NOT change message sending semantics, read receipt semantics, or pagination request semantics +- Do NOT rewrite the timeline UI widgets or message rendering design +- Do NOT add new crates or change `Cargo.toml` +- Do NOT run `cargo fmt` + +## Acceptance Criteria + +Scenario: Root cause is documented as Robrix UI-thread amplification, not assumed Makepad freeze + Test: manual_review_android_timeline_freeze_contract + Given this task contract is reviewed + When the reviewer reads the Intent and Constraints + Then the problem statement names `RoomScreen::process_timeline_updates` unbounded queue draining + And the problem statement names `sliding_sync::timeline_subscriber_handler` full timeline snapshot cloning + And the contract does not require Makepad platform changes + +Scenario: timeline update batch helper yields at the configured budget + Test: timeline_update_batch_yields_after_budget + Given 
`MAX_TIMELINE_UPDATES_PER_SIGNAL` is configured + When `should_yield_timeline_update_batch` is called with values below the limit + Then it returns `false` + When it is called with the limit or a value above the limit + Then it returns `true` + +Scenario: process_timeline_updates does not drain an unbounded backlog in one Signal + Test: manual_test_android_timeline_update_backlog_is_batched + Given a RoomScreen timeline update receiver has more than `MAX_TIMELINE_UPDATES_PER_SIGNAL` pending updates + When one `Event::Signal` is handled + Then at most `MAX_TIMELINE_UPDATES_PER_SIGNAL` timeline updates are applied in that call + And `SignalToUI::set_ui_signal()` is requested for the remaining backlog + And the UI thread returns to the Makepad event loop before processing the remaining updates + +Scenario: incremental timeline updates no longer clone the entire timeline snapshot + Test: manual_review_timeline_subscriber_sends_diffs_not_full_snapshot + Level: static_review + Verification Strength: source_inspection + Targets: src/sliding_sync.rs, src/home/room_screen.rs + Given `sliding_sync::timeline_subscriber_handler` receives SDK timeline diffs after the first update + When it enqueues an incremental `TimelineUpdate` + Then the update is `TimelineUpdate::ItemsChanged` and contains Robrix-owned item diffs and metadata + And it does not contain `new_items: timeline_items.clone()` + And `TimelineUpdate::FirstUpdate` remains the only full initial snapshot path + +Scenario: UI applies append and set diffs to the existing TimelineUiState items + Test: timeline_items_apply_append_and_set_diffs + Given `tl.items` contains two timeline items + When a diff update appends one item and sets the second item + Then `tl.items` contains three items + And the second item reflects the set item + And no full timeline replacement is required + +Scenario: UI applies removal and reset diffs without panicking + Test: timeline_items_apply_remove_truncate_clear_reset_diffs + Given 
`tl.items` contains multiple timeline items + When diff updates remove, truncate, clear, and reset items in valid SDK order + Then `tl.items` matches the expected item order after each operation + And no panic occurs for valid indices emitted by the SDK + +Scenario: bot discovery scans only changed indices for incremental updates + Test: timeline_update_scan_range_limits_incremental_work + Given `clear_cache == false` + And `changed_indices == 10..12` + And the new timeline length is 1000 + When the bot discovery scan range is computed + Then the range is `10..12` + When `clear_cache == true` + Then the range is `0..1000` + +Scenario: scroll position semantics are preserved after diff-based updates + Test: manual_test_timeline_scroll_position_preserved_after_diff_updates + Given the user is viewing a room timeline away from the bottom + When older events are inserted before the current visible range + Then the same visible event remains anchored after the update + And the timeline does not jump to the bottom + When a new message is appended while the user is at the bottom + Then the timeline remains in tail mode + +Scenario: Android large-room active timeline remains responsive during update bursts + Test: manual_test_android_large_room_timeline_remains_responsive + Given an Android device logged into an account with a large active room + And the room receives a burst of timeline updates + When the user scrolls the timeline or taps the input bar during the burst + Then touch input is still processed + And no Android ANR dialog appears + And logcat does not show the main thread stuck inside `process_timeline_updates` for multiple seconds + +Scenario: build and focused timeline tests pass + Test: cargo_check_and_timeline_tests_green + When the developer runs `cargo check` + And the developer runs `cargo clippy` + And the developer runs `cargo test timeline_update_batch_yields_after_budget` + And the developer runs `cargo test 
timeline_update_scan_range_limits_incremental_work` + And the developer runs `cargo test timeline_items_apply_append_and_set_diffs` + And the developer runs `cargo test timeline_items_apply_remove_truncate_clear_reset_diffs` + Then all commands complete with exit code `0` + +## Out of Scope + +- Fixing iOS IME candidate-bar / keyboard frame handling in Makepad platform code +- Changing Makepad `PortalList` internals +- Replacing Makepad with native Android `RecyclerView`, Compose `LazyColumn`, or iOS `UITableView` +- Full room-list startup pagination refactor for `entries_with_dynamic_adapters(usize::MAX)` +- Media cache redesign, avatar cache redesign, and message rendering visual redesign diff --git a/src/app.rs b/src/app.rs index 7da3840e..8e773689 100644 --- a/src/app.rs +++ b/src/app.rs @@ -3086,8 +3086,10 @@ mod tests { #[test] fn test_app_state_roundtrip_preserves_selected_room_with_empty_dock() { - let mut state = AppState::default(); - state.selected_room = Some(joined_room("!room:example.org", "octosbot")); + let state = AppState { + selected_room: Some(joined_room("!room:example.org", "octosbot")), + ..AppState::default() + }; assert!( state.saved_dock_state_home.open_rooms.is_empty(), "precondition: this test simulates the mobile case where selected_room persists without desktop dock tabs", diff --git a/src/home/room_screen.rs b/src/home/room_screen.rs index 733bd7fa..47315766 100644 --- a/src/home/room_screen.rs +++ b/src/home/room_screen.rs @@ -60,6 +60,7 @@ const BLURHASH_IMAGE_MAX_SIZE: u32 = 500; /// Use a larger batch when we are trying to fill the initial viewport, /// otherwise many short messages can trigger a long chain of tiny paginations. 
 const VIEWPORT_FILL_PAGINATION_SIZE: u16 = 150;
+const MAX_TIMELINE_UPDATES_PER_SIGNAL: usize = 8;
 const TOPIC_PREVIEW_CHARS: usize = 140;
 const ROOM_INFO_PANE_DESKTOP_WIDTH: f32 = 320.0;
 const ROOM_INFO_PANE_MOBILE_BREAKPOINT: f32 = 700.0;
@@ -1026,10 +1027,9 @@ fn compute_translation_lang_popup_abs_pos(button_rect: Rect, container_rect: Rec
     dvec2(popup_x, popup_y)
 }
 
-fn streaming_scan_range(
+fn timeline_update_scan_range(
     clear_cache: bool,
     changed_indices: &Range<usize>,
-    _old_len: usize,
     new_len: usize,
 ) -> Range<usize> {
     if clear_cache {
@@ -1041,6 +1041,67 @@
     }
 }
+fn should_yield_timeline_update_batch(processed_updates: usize) -> bool {
+    processed_updates >= MAX_TIMELINE_UPDATES_PER_SIGNAL
+}
+
+pub enum TimelineDiff<T> {
+    Append { values: Vector<T> },
+    Clear,
+    Insert { index: usize, value: T },
+    Set { index: usize, value: T },
+    Remove { index: usize },
+    PushFront { value: T },
+    PushBack { value: T },
+    PopFront,
+    PopBack,
+    Truncate { length: usize },
+    Reset { values: Vector<T> },
+}
+
+fn apply_timeline_diffs<T: Clone>(
+    items: &mut Vector<T>,
+    diffs: impl IntoIterator<Item = TimelineDiff<T>>,
+) {
+    for diff in diffs {
+        match diff {
+            TimelineDiff::Append { values } => {
+                items.append(values);
+            }
+            TimelineDiff::Clear => {
+                items.clear();
+            }
+            TimelineDiff::Insert { index, value } => {
+                items.insert(index, value);
+            }
+            TimelineDiff::Set { index, value } => {
+                items.set(index, value);
+            }
+            TimelineDiff::Remove { index } => {
+                items.remove(index);
+            }
+            TimelineDiff::PushFront { value } => {
+                items.push_front(value);
+            }
+            TimelineDiff::PushBack { value } => {
+                items.push_back(value);
+            }
+            TimelineDiff::PopFront => {
+                items.pop_front();
+            }
+            TimelineDiff::PopBack => {
+                items.pop_back();
+            }
+            TimelineDiff::Truncate { length } => {
+                items.truncate(length);
+            }
+            TimelineDiff::Reset { values } => {
+                *items = values;
+            }
+        }
+    }
+}
+
 fn refresh_stream_indices<'a, I>(
     event_ids: I,
     streaming_messages: &mut HashMap,
@@ -5790,10 +5851,13 @@ impl RoomScreen {
         Some(plaintext_body_of_timeline_item(event))
     }
 
-    fn discover_known_bot_user_ids_from_timeline_items(
+    fn discover_known_bot_user_ids_from_timeline_items<'a, I>(
         app_state: &AppState,
-        timeline_items: &Vector<Arc<TimelineItem>>,
-    ) -> Vec<OwnedUserId> {
+        timeline_items: I,
+    ) -> Vec<OwnedUserId>
+    where
+        I: IntoIterator<Item = &'a Arc<TimelineItem>>,
+    {
         let Ok(parent_bot_user_id) = app_state
             .bot_settings
             .resolved_bot_user_id(current_user_id().as_deref())
@@ -6116,7 +6180,15 @@
         let mut should_continue_backwards_pagination = false;
         let mut typing_users = None;
         let mut num_updates = 0;
-        while let Ok(update) = tl.update_receiver.try_recv() {
+        let mut has_pending_updates = false;
+        loop {
+            if should_yield_timeline_update_batch(num_updates) {
+                has_pending_updates = !tl.update_receiver.is_empty();
+                break;
+            }
+            let Ok(update) = tl.update_receiver.try_recv() else {
+                break;
+            };
             num_updates += 1;
             match update {
                 TimelineUpdate::FirstUpdate { initial_items } => {
@@ -6158,12 +6230,22 @@
                     }
                     done_loading = true;
                 }
-                TimelineUpdate::NewItems { new_items, changed_indices, is_append, clear_cache } => {
+                TimelineUpdate::ItemsChanged { item_diffs, changed_indices, is_append, clear_cache } => {
+                    let old_items = tl.items.clone();
+                    apply_timeline_diffs(&mut tl.items, item_diffs);
+                    if let Some(app_state) = app_state {
+                        let discovery_range = timeline_update_scan_range(
+                            clear_cache,
+                            &changed_indices,
+                            tl.items.len(),
+                        );
                         let discovered_bot_user_ids = Self::discover_known_bot_user_ids_from_timeline_items(
                             app_state,
-                            &new_items,
+                            tl.items
+                                .focus()
+                                .narrow(discovery_range),
                         );
                         if !discovered_bot_user_ids.is_empty() {
                             Cx::post_action(AppStateAction::KnownBotUserIdsDiscovered {
@@ -6171,9 +6253,9 @@
                             });
                         }
                     }
-                    if new_items.is_empty() {
-                        if !tl.items.is_empty() {
-                            log!("process_timeline_updates(): timeline (had {} items) was cleared for room {}", tl.items.len(), tl.kind.room_id());
+                    if tl.items.is_empty() {
+                        if !old_items.is_empty() {
+                            log!("process_timeline_updates(): timeline 
(had {} items) was cleared for room {}", old_items.len(), tl.kind.room_id()); // For now, we paginate a cleared timeline in order to be able to show something at least. // A proper solution would be what's described below, which would be to save a few event IDs // and then either focus on them (if we're not close to the end of the timeline) @@ -6205,12 +6287,12 @@ impl RoomScreen { let prior_items_changed = clear_cache || changed_indices.start <= curr_first_id; - if new_items.len() == tl.items.len() { + if tl.items.len() == old_items.len() { // log!("process_timeline_updates(): no jump necessary for updated timeline of same length: {}", items.len()); } - else if curr_first_id > new_items.len() { - log!("process_timeline_updates(): jumping to bottom: curr_first_id {} is out of bounds for {} new items", curr_first_id, new_items.len()); - portal_list.set_first_id_and_scroll(new_items.len().saturating_sub(1), 0.0); + else if curr_first_id > tl.items.len() { + log!("process_timeline_updates(): jumping to bottom: curr_first_id {} is out of bounds for {} new items", curr_first_id, tl.items.len()); + portal_list.set_first_id_and_scroll(tl.items.len().saturating_sub(1), 0.0); portal_list.set_tail_range(true); jump_to_bottom_button.update_visibility(cx, true); } @@ -6219,7 +6301,7 @@ impl RoomScreen { // which ensures that the timeline doesn't jump around unexpectedly and ruin the user's experience. 
                     else if let Some((curr_item_idx, new_item_idx, new_item_scroll, _event_id)) = prior_items_changed.then(||
-                        find_new_item_matching_current_item(cx, portal_list, curr_first_id, &tl.items, &new_items)
+                        find_new_item_matching_current_item(cx, portal_list, curr_first_id, &old_items, &tl.items)
                     )
                     .flatten()
                     {
@@ -6253,12 +6335,12 @@
                         }
                     }
-                    let start = changed_indices.start.min(new_items.len());
-                    let end = changed_indices.end.min(new_items.len());
+                    let start = changed_indices.start.min(tl.items.len());
+                    let end = changed_indices.end.min(tl.items.len());
                     let mut accepted_users: Vec<OwnedUserId> = Vec::new();
                     let mut room_members_changed = false;
                     for idx in start..end {
-                        let Some(new_item) = new_items.get(idx) else { continue };
+                        let Some(new_item) = tl.items.get(idx) else { continue };
                         let TimelineItemKind::Event(event_tl_item) = new_item.kind() else { continue };
                         let TimelineItemContent::MembershipChange(membership_change) = event_tl_item.content() else { continue };
                         if is_append {
@@ -6302,7 +6384,7 @@
                     if let LoadingPaneState::BackwardsPaginateUntilEvent { events_paginated, target_event_id, .. } = &mut loading_pane_state {
-                        *events_paginated += new_items.len().saturating_sub(tl.items.len());
+                        *events_paginated += tl.items.len().saturating_sub(old_items.len());
                         log!("While finding target event {target_event_id}, we have now loaded {events_paginated} messages...");
                         // Here, we assume that we have not yet found the target event,
                         // so we need to continue paginating backwards. 
@@ -6330,28 +6412,27 @@ impl RoomScreen { let previous_streaming_messages = std::mem::take(&mut tl.streaming_messages); let (rebuilt_streaming_messages, should_schedule_frame) = rebuild_streaming_messages_for_full_snapshot( - streaming_candidates_from_items(&new_items), + streaming_candidates_from_items(&tl.items), Some(&previous_streaming_messages), ); tl.streaming_messages = rebuilt_streaming_messages; if should_schedule_frame { self.streaming_next_frame = cx.new_next_frame(); } - } else if !new_items.is_empty() { + } else if !tl.items.is_empty() { let mut should_schedule_frame = false; - let scan_range = streaming_scan_range( + let scan_range = timeline_update_scan_range( clear_cache, &changed_indices, tl.items.len(), - new_items.len(), ); - let old_event_ids: HashSet<&EventId> = tl.items.iter() + let old_event_ids: HashSet<&EventId> = old_items.iter() .filter_map(|item| item_event_id(item)) .collect(); for idx in scan_range { - let Some(new_item) = new_items.get(idx) else { continue }; + let Some(new_item) = tl.items.get(idx) else { continue }; let TimelineItemKind::Event(new_evt) = new_item.kind() else { continue }; let Some(event_id) = new_evt.event_id().map(|id| id.to_owned()) else { continue }; let live = is_msc4357_live(new_evt); @@ -6406,7 +6487,6 @@ impl RoomScreen { } // --- End streaming detection --- - tl.items = new_items; refresh_stream_indices( tl.items.iter().map(item_event_id), &mut tl.streaming_messages, @@ -6499,7 +6579,7 @@ impl RoomScreen { if direction == PaginationDirection::Backwards { tl.backwards_pagination_in_flight = false; // Don't set `done_loading` to `true` here, because we want to keep the top space visible - // (with the "loading" message) until the corresponding `NewItems` update is received. + // (with the "loading" message) until the corresponding timeline item update is received. 
                     tl.fully_paginated = fully_paginated;
                     if fully_paginated {
                         done_loading = true;
                     }
@@ -6707,6 +6787,9 @@
             self.schedule_stream_timeout(cx);
             // log!("Applied {} timeline updates for room {}, redrawing with {} items...", num_updates, tl.kind.room_id(), tl.items.len());
             self.redraw(cx);
+            if has_pending_updates {
+                SignalToUI::set_ui_signal();
+            }
         }
     }
@@ -8199,10 +8282,10 @@ pub enum TimelineUpdate {
         /// The initial list of timeline items (events) for a room.
         initial_items: Vector<Arc<TimelineItem>>,
     },
-    /// The content of a room's timeline was updated in the background.
-    NewItems {
-        /// The entire list of timeline items (events) for a room.
-        new_items: Vector<Arc<TimelineItem>>,
+    /// The content of a room's timeline changed in the background.
+    ItemsChanged {
+        /// The ordered list of changes to apply to the current timeline items.
+        item_diffs: Vec<TimelineDiff<Arc<TimelineItem>>>,
         /// The range of indices in the `items` list that have been changed in this update
         /// and thus must be removed from any caches of drawn items in the timeline.
         /// Any items outside of this range are assumed to be unchanged and need not be redrawn. 
@@ -11044,15 +11127,57 @@ mod tests {
     }
 
     #[test]
-    fn test_streaming_scan_range() {
-        // Incremental: clamp sentinel to new_len
-        assert_eq!(streaming_scan_range(false, &(5..usize::MAX), 8, 9), 5..9);
-        // Append: new item at end is scanned
-        assert_eq!(streaming_scan_range(false, &(8..9), 8, 9), 8..9);
-        // No changes: empty range
-        assert_eq!(streaming_scan_range(false, &(8..8), 8, 8), 8..8);
-        // Clear cache: full scan
-        assert_eq!(streaming_scan_range(true, &(5..usize::MAX), 8, 9), 0..9);
+    fn timeline_update_scan_range_limits_incremental_work() {
+        assert_eq!(timeline_update_scan_range(false, &(5..usize::MAX), 9), 5..9);
+        assert_eq!(timeline_update_scan_range(false, &(8..9), 9), 8..9);
+        assert_eq!(timeline_update_scan_range(false, &(8..8), 8), 8..8);
+        assert_eq!(timeline_update_scan_range(true, &(5..usize::MAX), 9), 0..9);
+        assert_eq!(timeline_update_scan_range(false, &(10..12), 1000), 10..12);
+        assert_eq!(timeline_update_scan_range(false, &(998..usize::MAX), 1000), 998..1000);
+        assert_eq!(timeline_update_scan_range(true, &(10..12), 1000), 0..1000);
+    }
+
+    #[test]
+    fn timeline_update_batch_yields_after_budget() {
+        assert!(!should_yield_timeline_update_batch(0));
+        assert!(!should_yield_timeline_update_batch(MAX_TIMELINE_UPDATES_PER_SIGNAL - 1));
+        assert!(should_yield_timeline_update_batch(MAX_TIMELINE_UPDATES_PER_SIGNAL));
+        assert!(should_yield_timeline_update_batch(MAX_TIMELINE_UPDATES_PER_SIGNAL + 1));
+    }
+
+    #[test]
+    fn timeline_items_apply_append_and_set_diffs() {
+        let mut items = Vector::from(vec![1, 2]);
+
+        apply_timeline_diffs(
+            &mut items,
+            [
+                TimelineDiff::Append { values: Vector::from(vec![3]) },
+                TimelineDiff::Set { index: 1, value: 20 },
+                TimelineDiff::Insert { index: 1, value: 10 },
+            ],
+        );
+
+        assert_eq!(items.into_iter().collect::<Vec<_>>(), vec![1, 10, 20, 3]);
+    }
+
+    #[test]
+    fn timeline_items_apply_remove_truncate_clear_reset_diffs() {
+        let mut items = Vector::from(vec![1, 2, 3, 4]);
+
+        apply_timeline_diffs(
+            &mut items,
+            [
+                TimelineDiff::PopFront,
+                TimelineDiff::PopBack,
+                TimelineDiff::Remove { index: 1 },
+                TimelineDiff::Truncate { length: 1 },
+                TimelineDiff::Clear,
+                TimelineDiff::Reset { values: Vector::from(vec![9, 8]) },
+            ],
+        );
+
+        assert_eq!(items.into_iter().collect::<Vec<_>>(), vec![9, 8]);
+    }
 
     #[test]
diff --git a/src/sliding_sync.rs b/src/sliding_sync.rs
index 1aa40fde..7f539044 100644
--- a/src/sliding_sync.rs
+++ b/src/sliding_sync.rs
@@ -45,7 +45,7 @@ use hashbrown::{HashMap, HashSet};
 use crate::{
     account_manager::{self, Account}, app::{AppStateAction, RoomFilterRemoteSearchAction}, app_data_dir, avatar_cache::AvatarUpdate, event_preview::{BeforeText, TextPreview, text_preview_of_raw_timeline_event, text_preview_of_timeline_item}, home::{
-        add_room::{CreatableSpacesAction, CreateRoomAction, CreateRoomContext, KnockResultAction}, invite_screen::{JoinRoomResultAction, LeaveRoomResultAction}, link_preview::{LinkPreviewData, LinkPreviewDataNonNumeric, LinkPreviewRateLimitResponse}, room_screen::{ActionResponseResultAction, InviteResultAction, ReportRoomResultAction, TimelineUpdate}, rooms_list::{self, InvitedRoomInfo, InviterInfo, JoinedRoomInfo, RoomsListUpdate, build_room_search_text, enqueue_rooms_list_update}, rooms_list_header::RoomsListHeaderAction, tombstone_footer::SuccessorRoomDetails
+        add_room::{CreatableSpacesAction, CreateRoomAction, CreateRoomContext, KnockResultAction}, invite_screen::{JoinRoomResultAction, LeaveRoomResultAction}, link_preview::{LinkPreviewData, LinkPreviewDataNonNumeric, LinkPreviewRateLimitResponse}, room_screen::{ActionResponseResultAction, InviteResultAction, ReportRoomResultAction, TimelineDiff, TimelineUpdate}, rooms_list::{self, InvitedRoomInfo, InviterInfo, JoinedRoomInfo, RoomsListUpdate, build_room_search_text, enqueue_rooms_list_update}, rooms_list_header::RoomsListHeaderAction, tombstone_footer::SuccessorRoomDetails
     }, homeserver::{CapabilityProbeAction, HsCapabilities, IdentityProviderSummary}, login::login_screen::LoginAction, 
logout::{logout_confirm_modal::LogoutAction, logout_state_machine::{LogoutConfig, is_logout_in_progress, logout_with_state_machine}}, media_cache::{MediaCacheEntry, MediaCacheEntryRef}, persistence::{self, ClientSessionPersisted, load_app_state, take_skip_app_state_restore_once}, profile::{ user_profile::UserProfile, user_profile_cache::{UserProfileUpdate, enqueue_user_profile_update}, @@ -1587,13 +1587,15 @@ mod matrix_request_tests { #[test] fn test_should_restore_loaded_app_state_with_selected_room_and_empty_dock() { - let mut app_state = crate::app::AppState::default(); - app_state.selected_room = Some(crate::app::SelectedRoom::JoinedRoom { - room_name_id: crate::utils::RoomNameId::new( - matrix_sdk::RoomDisplayName::Named("octosbot".into()), - "!room:example.org".parse().unwrap(), - ), - }); + let app_state = crate::app::AppState { + selected_room: Some(crate::app::SelectedRoom::JoinedRoom { + room_name_id: crate::utils::RoomNameId::new( + matrix_sdk::RoomDisplayName::Named("octosbot".into()), + "!room:example.org".parse().unwrap(), + ), + }), + ..crate::app::AppState::default() + }; assert!( should_restore_loaded_app_state(&app_state), @@ -6401,21 +6403,25 @@ async fn timeline_subscriber_handler( let mut clear_cache = false; // whether the changes include items being appended to the end of the timeline let mut is_append = false; + let mut item_diffs = Vec::new(); for diff in batch { num_updates += 1; match diff { VectorDiff::Append { values } => { let _values_len = values.len(); index_of_first_change = min(index_of_first_change, timeline_items.len()); - timeline_items.extend(values); + let diff_values = values.clone(); + timeline_items.append(values); index_of_last_change = max(index_of_last_change, timeline_items.len()); if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff Append {_values_len}. 
Changes: {index_of_first_change}..{index_of_last_change}"); } is_append = true; + item_diffs.push(TimelineDiff::Append { values: diff_values }); } VectorDiff::Clear => { if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff Clear"); } clear_cache = true; timeline_items.clear(); + item_diffs.push(TimelineDiff::Clear); } VectorDiff::PushFront { value } => { if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff PushFront"); } @@ -6426,14 +6432,16 @@ async fn timeline_subscriber_handler( } clear_cache = true; - timeline_items.push_front(value); + timeline_items.push_front(value.clone()); + item_diffs.push(TimelineDiff::PushFront { value }); } VectorDiff::PushBack { value } => { index_of_first_change = min(index_of_first_change, timeline_items.len()); - timeline_items.push_back(value); + timeline_items.push_back(value.clone()); index_of_last_change = max(index_of_last_change, timeline_items.len()); if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff PushBack. Changes: {index_of_first_change}..{index_of_last_change}"); } is_append = true; + item_diffs.push(TimelineDiff::PushBack { value }); } VectorDiff::PopFront => { if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff PopFront"); } @@ -6443,12 +6451,14 @@ async fn timeline_subscriber_handler( *i = i.saturating_sub(1); // account for the first item being removed. } // This doesn't affect whether we should reobtain the latest event. + item_diffs.push(TimelineDiff::PopFront); } VectorDiff::PopBack => { timeline_items.pop_back(); index_of_first_change = min(index_of_first_change, timeline_items.len()); index_of_last_change = usize::MAX; if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff PopBack. 
Changes: {index_of_first_change}..{index_of_last_change}"); } + item_diffs.push(TimelineDiff::PopBack); } VectorDiff::Insert { index, value } => { if index == 0 { @@ -6471,14 +6481,16 @@ async fn timeline_subscriber_handler( .map(|(i, ev)| (i + index, ev)); } - timeline_items.insert(index, value); + timeline_items.insert(index, value.clone()); if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff Insert at {index}. Changes: {index_of_first_change}..{index_of_last_change}"); } + item_diffs.push(TimelineDiff::Insert { index, value }); } VectorDiff::Set { index, value } => { index_of_first_change = min(index_of_first_change, index); index_of_last_change = max(index_of_last_change, index.saturating_add(1)); - timeline_items.set(index, value); + timeline_items.set(index, value.clone()); if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff Set at {index}. Changes: {index_of_first_change}..{index_of_last_change}"); } + item_diffs.push(TimelineDiff::Set { index, value }); } VectorDiff::Remove { index } => { if index == 0 { @@ -6495,6 +6507,7 @@ async fn timeline_subscriber_handler( } timeline_items.remove(index); if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff Remove at {index}. Changes: {index_of_first_change}..{index_of_last_change}"); } + item_diffs.push(TimelineDiff::Remove { index }); } VectorDiff::Truncate { length } => { if length == 0 { @@ -6505,11 +6518,13 @@ async fn timeline_subscriber_handler( } timeline_items.truncate(length); if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff Truncate to length {length}. 
Changes: {index_of_first_change}..{index_of_last_change}"); } + item_diffs.push(TimelineDiff::Truncate { length }); } VectorDiff::Reset { values } => { if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: room {room_id}, thread {thread_root_event_id:?} diff Reset, new length {}", values.len()); } clear_cache = true; // we must assume all items have changed. - timeline_items = values; + timeline_items = values.clone(); + item_diffs.push(TimelineDiff::Reset { values }); } } } @@ -6530,14 +6545,14 @@ async fn timeline_subscriber_handler( if LOG_TIMELINE_DIFFS { log!("timeline_subscriber: applied {num_updates} updates for room {room_id}, thread {thread_root_event_id:?}, timeline now has {} items. is_append? {is_append}, clear_cache? {clear_cache}. Changes: {changed_indices:?}.", timeline_items.len()); } - timeline_update_sender.send(TimelineUpdate::NewItems { - new_items: timeline_items.clone(), + timeline_update_sender.send(TimelineUpdate::ItemsChanged { + item_diffs, changed_indices, clear_cache, is_append, - }).expect("Error: timeline update sender couldn't send update with new items!"); + }).expect("Error: timeline update sender couldn't send timeline item diffs!"); - // We must send this update *after* the actual NewItems update, + // We must send this update *after* the actual ItemsChanged update, // otherwise the UI thread (RoomScreen) won't be able to correctly locate the target event. if let Some((index, found_event_id)) = found_target_event_id.take() { target_event_id = None;
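For reference, the bounded-drain contract this patch relies on can be sketched in isolation. This is a simplified model, not Robrix code: `drain_bounded` and `MAX_UPDATES_PER_SIGNAL` are hypothetical stand-ins for `process_timeline_updates`' receive loop and `MAX_TIMELINE_UPDATES_PER_SIGNAL`, a `VecDeque` stands in for the update channel's receiver, and the returned flag models the `SignalToUI::set_ui_signal()` re-arm that defers the remaining backlog to a later UI event.

```rust
use std::collections::VecDeque;

/// Hard cap on updates applied per UI signal (illustrative value,
/// mirroring the patch's MAX_TIMELINE_UPDATES_PER_SIGNAL).
const MAX_UPDATES_PER_SIGNAL: usize = 8;

fn should_yield(processed: usize) -> bool {
    processed >= MAX_UPDATES_PER_SIGNAL
}

/// Applies at most MAX_UPDATES_PER_SIGNAL updates from the queue.
/// Returns the applied updates plus a flag indicating whether a
/// follow-up signal is needed for the remaining backlog.
fn drain_bounded(queue: &mut VecDeque<u32>) -> (Vec<u32>, bool) {
    let mut applied = Vec::new();
    loop {
        if should_yield(applied.len()) {
            // Budget exhausted: yield back to the event loop and
            // request another signal if work remains.
            return (applied, !queue.is_empty());
        }
        match queue.pop_front() {
            Some(update) => applied.push(update),
            None => return (applied, false),
        }
    }
}

fn main() {
    // A burst of 20 pending updates is consumed across three signals.
    let mut queue: VecDeque<u32> = (0..20).collect();
    let (first, pending) = drain_bounded(&mut queue);
    println!("{} {}", first.len(), pending); // 8 true
    let (second, pending) = drain_bounded(&mut queue);
    println!("{} {}", second.len(), pending); // 8 true
    let (third, pending) = drain_bounded(&mut queue);
    println!("{} {}", third.len(), pending); // 4 false
}
```

The key property is that the UI thread returns to the event loop after each bounded batch, so input handling interleaves with backlog processing instead of being blocked by one unbounded drain.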