Closed
Labels
ADL (Applies to Alder Lake platform), P1 (Blocker bugs or important features), Zephyr (Issues only observed with Zephyr integrated), bug (Something isn't working as expected), duplicate (This issue or pull request already exists)
Description
Describe the bug
On the ADLP platform, the HW params IPC fails on stream 1 when testing playback. The firmware is built with Zephyr using XCC.
[ 130.968054] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: ipc rx: 0x90020000: GLB_TRACE_MSG: DMA_POSITION
[ 130.968055] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: stream_tag 1
[ 130.968061] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: ipc tx: 0x60010000: GLB_STREAM_MSG: PCM_PARAMS
[ 130.968090] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: ipc rx done: 0x90020000: GLB_TRACE_MSG: DMA_POSITION
[ 130.968230] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: ipc tx error for 0x60010000 (msg/reply size: 108/20): -22
[ 130.968236] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: HW params ipc failed for stream 1
[ 130.968239] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: ASoC: error at snd_soc_pcm_component_hw_params on 0000:00:1f.3: -22
[ 130.968244] kernel: Speakers: ASoC: __soc_pcm_hw_params() failed (-22)
To Reproduce
aplay -Dhw:0,0 -r 48000 -c 2 -f S16_LE -d 3 /dev/zero -v -q
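For reference, a minimal C equivalent of the aplay command above, using the public ALSA API (snd_pcm_open / snd_pcm_set_params); it exercises the same hw_params path. The device name and stream parameters are taken from the repro command; period/buffer sizing is left to ALSA defaults, so it may differ slightly from aplay's.

/* repro.c - build with: gcc repro.c -o repro -lasound */
#include <alsa/asoundlib.h>

int main(void)
{
	snd_pcm_t *pcm;
	short buf[48000 / 10 * 2] = { 0 };	/* 100 ms of silence, S16_LE stereo */
	int err;

	/* Same PCM device as "aplay -Dhw:0,0". */
	err = snd_pcm_open(&pcm, "hw:0,0", SND_PCM_STREAM_PLAYBACK, 0);
	if (err < 0) {
		fprintf(stderr, "open: %s\n", snd_strerror(err));
		return 1;
	}

	/* This call drives the hw_params path that fails with -22 (-EINVAL). */
	err = snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
				 SND_PCM_ACCESS_RW_INTERLEAVED,
				 2, 48000, 0 /* no resampling on hw: */, 500000);
	if (err < 0) {
		fprintf(stderr, "set_params: %s\n", snd_strerror(err));
		snd_pcm_close(pcm);
		return 1;
	}

	/* Write a few periods of silence, then drain and close. */
	for (int i = 0; i < 30; i++)
		snd_pcm_writei(pcm, buf, sizeof(buf) / (2 * sizeof(short)));
	snd_pcm_drain(pcm);
	snd_pcm_close(pcm);
	return 0;
}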
Reproduction Rate
100%
Expected behavior
No IPC errors when playing audio.
Impact
The HW params IPC fails and playback cannot start.
Environment
Kernel: sof-dev/4d8f25303dc0
SOF: main/2c0d875
Topology: sof-adl-nocodec.tplg
Platform: ADLP-RVP-NOCODEC
Screenshots or console output
2022-02-24 07:09:42 UTC [REMOTE_COMMAND] aplay -Dhw:0,0 -r 48000 -c 2 -f S16_LE -d 3 /dev/zero -v -q
aplay: set_params:1407: Unable to install hw params:
ACCESS: RW_INTERLEAVED
FORMAT: S16_LE
SUBFORMAT: STD
SAMPLE_BITS: 16
FRAME_BITS: 32
CHANNELS: 2
RATE: 48000
PERIOD_TIME: (85333 85334)
PERIOD_SIZE: 4096
PERIOD_BYTES: 16384
PERIODS: 4
BUFFER_TIME: (341333 341334)
BUFFER_SIZE: 16384
BUFFER_BYTES: 65536
TICK_TIME: 0
2022-02-24 07:09:42 UTC [REMOTE_ERROR] aplay on PCM hw:0,0 failed at 1/1.
dmesg
[ 131.383146] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: DAI config 0 matches pcm hw params
[ 131.383150] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: DAI config 0 matches pcm hw params
[ 131.383154] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: rate_min: 48000 rate_max: 48000
[ 131.383158] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: channels_min: 2 channels_max: 2
[ 131.383171] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: pcm: hw params stream 0 dir 0
[ 131.383187] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: FW Poll Status: reg[0x160]=0x40000 successful
[ 131.383211] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: FW Poll Status: reg[0x160]=0x40000 successful
[ 131.383219] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: hda_dsp_stream_setup_bdl: period_bytes:0x4000
[ 131.383225] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: hda_dsp_stream_setup_bdl: periods:4
[ 131.383276] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: ipc tx: 0x40070000: GLB_PM_MSG: CORE_ENABLE
[ 131.383685] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: ipc tx error for 0x40070000 (msg/reply size: 12/12): -22
[ 131.383711] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: error: failed to enable target core for widget PIPELINE.7.SSP0.OUT
[ 131.383722] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: Failed to set up connected widgets
[ 131.383732] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: error: failed widget list set up for pcm 0 dir 0
[ 131.383743] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: ASoC: error at snd_soc_pcm_component_hw_params on 0000:00:1f.3: -22
[ 131.383760] kernel: Port0: ASoC: __soc_pcm_hw_params() failed (-22)
[ 131.383780] kernel: Port0: ASoC: dpcm_fe_dai_hw_params failed (-22)
[ 131.383793] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: pcm: free stream 0 dir 0
[ 131.384482] kernel: sof-audio-pci-intel-tgl 0000:00:1f.3: pcm: close stream 0 dir 0
logger
07:09:42 +0000 UTC
[ 242310490.579780] ( 0.000000) c0 dma-trace src/trace/dma-trace.c:343 INFO DMA: FW ABI 0x3014001 DBG ABI 0x5003000 tags SOF:v2.0-rc1-507-gbe752fdc3f2a zephyr:BUILD_VERSION src hash 0xbe752fdc (ldc hash 0xbe752fdc)
[ 131.458328] ( 131.458328) c0 zll-schedule src/schedule/zephyr_ll.c:287 INFO task add 0x9e074db0 dma-trace-task <2b972272-c5b1-4b7e-926f-0fc5cb4c4690> priority 4 flags 0x0
[ 2190.572830] ( 2059.114502) c0 ipc src/ipc/ipc3/handler.c:1579 INFO ipc: new cmd 0x40020000
[ 2210.156162] ( 19.583332) c0 ipc src/ipc/ipc3/handler.c:663 INFO ipc: pm -> restore
[ 8178.228842] ( 5968.072754) c0 ipc src/ipc/ipc3/handler.c:1579 INFO ipc: new cmd 0x40070000
[ 8200.051757] ( 21.822916) c0 ipc src/ipc/ipc3/handler.c:685 ERROR ipc: CONFIG_CORE_COUNT: 2 < core enable mask: 5
[ 2043508.252132] ( 2035308.250000) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e075418 agent-work avg 48, max 341
[ 2043613.408377] ( 105.156242) c0 zll-schedule ......../zephyr_domain.c:89 INFO ll timer avg 350, max 2184, overruns 0
[ 2073513.198856] ( 29899.791016) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e074db0 dma-trace-task <2b972272-c5b1-4b7e-926f-0fc5cb4c4690> avg 29, max 1386
[ 3093259.564585] ( 1019746.375000) c0 ipc src/ipc/ipc3/handler.c:1579 INFO ipc: new cmd 0x90040000
[ 3093278.158334] ( 18.593750) c0 ipc src/ipc/ipc3/handler.c:915 INFO ipc: debug cmd 0x40000
[ 3093296.595833] ( 18.437500) c0 ipc src/ipc/ipc3/handler.c:885 INFO ipc: trace_filter_update received, size 2 elems
[ 3093308.575000] ( 11.979166) c0 ipc src/trace/trace.c:435 INFO Adaptive filtering disabled by user
[ 3093324.147916] ( 15.572916) c0 ipc src/ipc/ipc3/handler.c:902 INFO trace_filter_update for UUID key 0x207B0, comp -1.-1 affected 1 components
[ 3119357.792715] ( 26033.644531) c0 ipc src/ipc/ipc3/handler.c:1579 INFO ipc: new cmd 0x90040000
[ 3119368.730214] ( 10.937500) c0 ipc src/ipc/ipc3/handler.c:915 INFO ipc: debug cmd 0x40000
[ 3119379.198964] ( 10.468750) c0 ipc src/ipc/ipc3/handler.c:885 INFO ipc: trace_filter_update received, size 2 elems
[ 3119396.438546] ( 17.239582) c0 ipc src/ipc/ipc3/handler.c:902 INFO trace_filter_update for UUID key 0x207B0, comp -1.-1 affected 1 components
[ 4091498.118668] ( 972101.687500) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e075418 agent-work avg 48, max 53
[ 4091592.910331] ( 94.791664) c0 zll-schedule ......../zephyr_domain.c:89 INFO ll timer avg 355, max 1962, overruns 0
[ 4121504.627893] ( 29911.716797) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e074db0 dma-trace-task <2b972272-c5b1-4b7e-926f-0fc5cb4c4690> avg 32, max 1356
[ 6139497.308122] ( 2017992.625000) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e075418 agent-work avg 48, max 50
[ 6139591.735201] ( 94.427078) c0 zll-schedule ......../zephyr_domain.c:89 INFO ll timer avg 352, max 1951, overruns 0
warn: failed to fread() 4 params from the log for ../../src/schedule/zephyr_ll.c:142
warn: log's End Of File. Device suspend?
---- Re-opening trace input file; device suspend? -----
[ 263238094.696108] ( 0.000000) c0 dma-trace src/trace/dma-trace.c:343 INFO DMA: FW ABI 0x3014001 DBG ABI 0x5003000 tags SOF:v2.0-rc1-507-gbe752fdc3f2a zephyr:BUILD_VERSION src hash 0xbe752fdc (ldc hash 0xbe752fdc)
[ 130.937495] ( 130.937500) c0 zll-schedule src/schedule/zephyr_ll.c:287 INFO task add 0x9e074db0 dma-trace-task <2b972272-c5b1-4b7e-926f-0fc5cb4c4690> priority 4 flags 0x0
[ 1052.135375] ( 921.197876) c0 ipc src/ipc/ipc3/handler.c:1579 INFO ipc: new cmd 0x40020000
[ 1071.822874] ( 19.687500) c0 ipc src/ipc/ipc3/handler.c:663 INFO ipc: pm -> restore
[ 2827.604054] ( 1755.781128) c0 ipc src/ipc/ipc3/handler.c:1579 INFO ipc: new cmd 0x40070000
[ 2849.322803] ( 21.718750) c0 ipc src/ipc/ipc3/handler.c:685 ERROR ipc: CONFIG_CORE_COUNT: 2 < core enable mask: 5
[ 2043346.220888] ( 2040496.875000) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e075418 agent-work avg 48, max 340
[ 2043451.741717] ( 105.520828) c0 zll-schedule ......../zephyr_domain.c:89 INFO ll timer avg 350, max 2188, overruns 0
[ 2073352.573862] ( 29900.832031) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e074db0 dma-trace-task <2b972272-c5b1-4b7e-926f-0fc5cb4c4690> avg 29, max 1393
[ 3260313.568363] ( 1186961.000000) c0 ipc src/ipc/ipc3/handler.c:1579 INFO ipc: new cmd 0x90040000
[ 3260332.266279] ( 18.697916) c0 ipc src/ipc/ipc3/handler.c:915 INFO ipc: debug cmd 0x40000
[ 3260350.703779] ( 18.437500) c0 ipc src/ipc/ipc3/handler.c:885 INFO ipc: trace_filter_update received, size 2 elems
[ 3260362.735028] ( 12.031249) c0 ipc src/trace/trace.c:435 INFO Adaptive filtering disabled by user
[ 3260378.255861] ( 15.520833) c0 ipc src/ipc/ipc3/handler.c:902 INFO trace_filter_update for UUID key 0x207B0, comp -1.-1 affected 1 components
[ 3291983.098355] ( 31604.841797) c0 ipc src/ipc/ipc3/handler.c:1579 INFO ipc: new cmd 0x90040000
[ 3291993.983771] ( 10.885416) c0 ipc src/ipc/ipc3/handler.c:915 INFO ipc: debug cmd 0x40000
[ 3292004.400437] ( 10.416666) c0 ipc src/ipc/ipc3/handler.c:885 INFO ipc: trace_filter_update received, size 2 elems
[ 3292021.587937] ( 17.187500) c0 ipc src/ipc/ipc3/handler.c:902 INFO trace_filter_update for UUID key 0x207B0, comp -1.-1 affected 1 components
[ 4091336.504092] ( 799314.937500) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e075418 agent-work avg 48, max 53
[ 4091431.295754] ( 94.791664) c0 zll-schedule ......../zephyr_domain.c:89 INFO ll timer avg 355, max 1964, overruns 0
[ 4121343.065399] ( 29911.769531) c0 zll-schedule src/schedule/zephyr_ll.c:142 INFO ll task 0x9e074db0 dma-trace-task <2b972272-c5b1-4b7e-926f-0fc5cb4c4690> avg 32, max 1357
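The firmware traces above point at the root cause: the CORE_ENABLE IPC (0x40070000) is rejected because the requested core enable mask (5, i.e. cores 0 and 2) does not fit in a CONFIG_CORE_COUNT=2 build. A minimal sketch of that kind of mask check, with hypothetical names rather than the actual src/ipc/ipc3/handler.c code:

#include <errno.h>
#include <stdio.h>

/* Hypothetical illustration, not the SOF handler: two cores built in,
 * mirroring "CONFIG_CORE_COUNT: 2" from the trace above. */
#define CONFIG_CORE_COUNT 2

/* Reject a core-enable request whose mask names a core that was not built. */
static int core_enable_check(unsigned int enable_mask)
{
	unsigned int valid_mask = (1u << CONFIG_CORE_COUNT) - 1; /* 0x3 for 2 cores */

	if (enable_mask & ~valid_mask) {
		fprintf(stderr, "ipc: CONFIG_CORE_COUNT: %d < core enable mask: %u\n",
			CONFIG_CORE_COUNT, enable_mask);
		return -EINVAL; /* surfaces as the -22 seen in dmesg */
	}
	return 0;
}

int main(void)
{
	/* Mask 0x5 = cores 0 and 2; core 2 does not exist in a 2-core build. */
	return core_enable_check(0x5) == 0 ? 0 : 1;
}

A mask of 0x5 needs core index 2, which only exists when the firmware is built with three or more cores, so the CORE_ENABLE request is rejected and hw_params returns -22 to the kernel, as seen in the dmesg section.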