To get started with the package, I tried the code provided by the developer. I'm on a Windows 7 64-bit machine; I used a Nexus 4 emulator, then a Nexus 5X virtual device with Android 5 and now with Android Q, plus a real phone running Android 7.0. The phone's on-board text-to-speech works perfectly. In every configuration the interface appears unresponsive unless I tap "Initialize", and even then nothing happens in the app itself; the only activity is in Android Studio's Run window. Here is the output:
Launching lib\main.dart on Android SDK built for x86 64 in debug mode...
Running Gradle task 'assembleDebug'...
√ Built build\app\outputs\apk\debug\app-debug.apk.
Installing build\app\outputs\apk\app.apk...
Flutter is taking longer than expected to report its views. Still trying...
Debug service listening on ws://127.0.0.1:53319/B_tNUT2pR7I=/ws
Syncing files to device Android SDK built for x86 64...
W/xample.flutter( 5738): Accessing hidden method Landroid/view/accessibility/AccessibilityNodeInfo;->getSourceNodeId()J (greylist, reflection, allowed)
W/xample.flutter( 5738): Accessing hidden method Landroid/view/accessibility/AccessibilityRecord;->getSourceNodeId()J (greylist, reflection, allowed)
W/xample.flutter( 5738): Accessing hidden field Landroid/view/accessibility/AccessibilityNodeInfo;->mChildNodeIds:Landroid/util/LongArray; (greylist, reflection, allowed)
W/xample.flutter( 5738): Accessing hidden method Landroid/util/LongArray;->get(I)J (greylist, reflection, allowed)
I/Choreographer( 5738): Skipped 53 frames! The application may be doing too much work on its main thread.
D/HostConnection( 5738): HostConnection::get() New Host Connection established 0x70249825f180, tid 5786
D/HostConnection( 5738): HostComposition ext ANDROID_EMU_CHECKSUM_HELPER_v1 ANDROID_EMU_dma_v1 ANDROID_EMU_direct_mem ANDROID_EMU_host_composition_v1 ANDROID_EMU_host_composition_v2 ANDROID_EMU_YUV420_888_to_NV21 ANDROID_EMU_YUV_Cache ANDROID_EMU_async_unmap_buffer GL_OES_EGL_image_external_essl3 GL_OES_vertex_array_object GL_KHR_texture_compression_astc_ldr ANDROID_EMU_gles_max_version_3_0
D/HostConnection( 5738): HostConnection::get() New Host Connection established 0x702497d3e6a0, tid 5955
D/HostConnection( 5738): HostComposition ext ANDROID_EMU_CHECKSUM_HELPER_v1 ANDROID_EMU_dma_v1 ANDROID_EMU_direct_mem ANDROID_EMU_host_composition_v1 ANDROID_EMU_host_composition_v2 ANDROID_EMU_YUV420_888_to_NV21 ANDROID_EMU_YUV_Cache ANDROID_EMU_async_unmap_buffer GL_OES_EGL_image_external_essl3 GL_OES_vertex_array_object GL_KHR_texture_compression_astc_ldr ANDROID_EMU_gles_max_version_3_0
D/eglCodecCommon( 5738): setVertexArrayObject: set vao to 0 (0) 0 0
D/EGL_emulation( 5738): eglCreateContext: 0x702497d3e740: maj 3 min 0 rcv 3
D/eglCodecCommon( 5738): setVertexArrayObject: set vao to 0 (0) 0 0
D/EGL_emulation( 5738): eglCreateContext: 0x70249825f220: maj 3 min 0 rcv 3
D/EGL_emulation( 5738): eglMakeCurrent: 0x70249825f220: ver 3 0 (tinfo 0x702498218680)
W/Gralloc3( 5738): mapper 3.x is not supported
D/HostConnection( 5738): createUnique: call
D/HostConnection( 5738): HostConnection::get() New Host Connection established 0x70249825f360, tid 5786
D/HostConnection( 5738): HostComposition ext ANDROID_EMU_CHECKSUM_HELPER_v1 ANDROID_EMU_dma_v1 ANDROID_EMU_direct_mem ANDROID_EMU_host_composition_v1 ANDROID_EMU_host_composition_v2 ANDROID_EMU_YUV420_888_to_NV21 ANDROID_EMU_YUV_Cache ANDROID_EMU_async_unmap_buffer GL_OES_EGL_image_external_essl3 GL_OES_vertex_array_object GL_KHR_texture_compression_astc_ldr ANDROID_EMU_gles_max_version_3_0
D/eglCodecCommon( 5738): allocate: Ask for block of size 0x1000
D/eglCodecCommon( 5738): allocate: ioctl allocate returned offset 0x3ff807000 size 0x2000
D/HostConnection( 5738): createUnique: call
D/HostConnection( 5738): HostConnection::get() New Host Connection established 0x702448a9ff00, tid 5955
D/HostConnection( 5738): HostComposition ext ANDROID_EMU_CHECKSUM_HELPER_v1 ANDROID_EMU_dma_v1 ANDROID_EMU_direct_mem ANDROID_EMU_host_composition_v1 ANDROID_EMU_host_composition_v2 ANDROID_EMU_YUV420_888_to_NV21 ANDROID_EMU_YUV_Cache ANDROID_EMU_async_unmap_buffer GL_OES_EGL_image_external_essl3 GL_OES_vertex_array_object GL_KHR_texture_compression_astc_ldr ANDROID_EMU_gles_max_version_3_0
D/EGL_emulation( 5738): eglMakeCurrent: 0x702497d3e740: ver 3 0 (tinfo 0x70244d193360)
D/EGL_emulation( 5738): eglMakeCurrent: 0x70249825f220: ver 3 0 (tinfo 0x702498218680)
D/eglCodecCommon( 5738): setVertexArrayObject: set vao to 0 (0) 1 0
I/OpenGLRenderer( 5738): Davey! duration=16105ms; Flags=1, IntendedVsync=320862128930, Vsync=321745462228, OldestInputEvent=9223372036854775807, NewestInputEvent=0, HandleInputStart=321750930400, AnimationStart=321751011720, PerformTraversalsStart=321751071720, DrawStart=330820804190, SyncQueued=331199393480, SyncStart=331258181510, IssueDrawCommandsStart=331421431670, SwapBuffers=336112080650, FrameCompleted=337026255970, DequeueBufferDuration=550559000, QueueBufferDuration=35074000,
I/Choreographer( 5738): Skipped 953 frames! The application may be doing too much work on its main thread.
D/EGL_emulation( 5738): eglMakeCurrent: 0x702497d3e740: ver 3 0 (tinfo 0x70244d193360)
D/eglCodecCommon( 5738): setVertexArrayObject: set vao to 0 (0) 0 0
I/Choreographer( 5738): Skipped 40 frames! The application may be doing too much work on its main thread.
I/OpenGLRenderer( 5738): Davey! duration=782ms; Flags=0, IntendedVsync=349271346526, Vsync=349938013166, OldestInputEvent=9223372036854775807, NewestInputEvent=0, HandleInputStart=349954335870, AnimationStart=349954427840, PerformTraversalsStart=349997031300, DrawStart=349997268140, SyncQueued=349997407090, SyncStart=350019841510, IssueDrawCommandsStart=350019994270, SwapBuffers=350027291360, FrameCompleted=350076153750, DequeueBufferDuration=149000, QueueBufferDuration=9594000,
I/OpenGLRenderer( 5738): Davey! duration=818ms; Flags=0, IntendedVsync=350655153383, Vsync=350755153379, OldestInputEvent=9223372036854775807, NewestInputEvent=0, HandleInputStart=350758548020, AnimationStart=350758679460, PerformTraversalsStart=350933225780, DrawStart=351140119140, SyncQueued=351140550980, SyncStart=351159859490, IssueDrawCommandsStart=351159930940, SwapBuffers=351165057370, FrameCompleted=351492805850, DequeueBufferDuration=292974000, QueueBufferDuration=727000,
W/xample.flutter( 5738): Verification of androidx.core.view.DragAndDropPermissionsCompat androidx.core.app.ActivityCompat.requestDragAndDropPermissions(android.app.Activity, android.view.DragEvent) took 201.593ms
E/flutter ( 5738): [ERROR:flutter/lib/ui/ui_dart_state.cc(157)] Unhandled Exception: PlatformException(multipleRequests, Only one initialize at a time, null)
E/flutter ( 5738): #0      StandardMethodCodec.decodeEnvelope (package:flutter/src/services/message_codecs.dart:569:7)
E/flutter ( 5738): #1      MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:156:18)
E/flutter ( 5738): <asynchronous suspension>
E/flutter ( 5738): #2      MethodChannel.invokeMethod (package:flutter/src/services/platform_channel.dart:329:12)
E/flutter ( 5738): #3      SpeechToText.initialize (package:speech_to_text/speech_to_text.dart:174:10)
E/flutter ( 5738): #4      _MyAppState.initSpeechState (package:flutter6/main.dart:34:35)
E/flutter ( 5738): #5      _InkResponseState._handleTap (package:flutter/src/material/ink_well.dart:779:19)
E/flutter ( 5738): #6      _InkResponseState.build.<anonymous closure> (package:flutter/src/material/ink_well.dart:862:36)
E/flutter ( 5738): #7      GestureRecognizer.invokeCallback (package:flutter/src/gestures/recognizer.dart:182:24)
E/flutter ( 5738): #8      TapGestureRecognizer.handleTapUp (package:flutter/src/gestures/tap.dart:504:11)
E/flutter ( 5738): #9      BaseTapGestureRecognizer._checkUp (package:flutter/src/gestures/tap.dart:282:5)
E/flutter ( 5738): #10     BaseTapGestureRecognizer.handlePrimaryPointer (package:flutter/src/gestures/tap.dart:217:7)
E/flutter ( 5738): #11     PrimaryPointerGestureRecognizer.handleEvent (package:flutter/src/gestures/recognizer.dart:475:9)
E/flutter ( 5738): #12     PointerRouter._dispatch (package:flutter/src/gestures/pointer_router.dart:76:12)
E/flutter ( 5738): #13     PointerRouter._dispatchEventToRoutes.<anonymous closure> (package:flutter/src/gestures/pointer_router.dart:122:9)
E/flutter ( 5738): #14     _LinkedHashMapMixin.forEach (dart:collection-patch/compact_hash.dart:379:8)
E/flutter ( 5738): #15     PointerRouter._dispatchEventToRoutes (package:flutter/src/gestures/pointer_router.dart:120:18)
E/flutter ( 5738): #16     PointerRouter.route (package:flutter/src/gestures/pointer_router.dart:106:7)
E/flutter ( 5738): #17     GestureBinding.handleEvent (package:flutter/src/gestures/binding.dart:218:19)
E/flutter ( 5738): #18     GestureBinding.dispatchEvent (package:flutter/src/gestures/binding.dart:198:22)
E/flutter ( 5738): #19     GestureBinding._handlePointerEvent (package:flutter/src/gestures/binding.dart:156:7)
E/flutter ( 5738): #20     GestureBinding._flushPointerEventQueue (package:flutter/src/gestures/binding.dart:102:7)
E/flutter ( 5738): #21     GestureBinding._handlePointerDataPacket (package:flutter/src/gestures/binding.dart:86:7)
E/flutter ( 5738): #22     _rootRunUnary (dart:async/zone.dart:1196:13)
E/flutter ( 5738): #23     _CustomZone.runUnary (dart:async/zone.dart:1085:19)
E/flutter ( 5738): #24     _CustomZone.runUnaryGuarded (dart:async/zone.dart:987:7)
E/flutter ( 5738): #25     _invoke1 (dart:ui/hooks.dart:275:10)
E/flutter ( 5738): #26     _dispatchPointerDataPacket (dart:ui/hooks.dart:184:5)
E/flutter ( 5738):
I/Choreographer( 5738): Skipped 31 frames! The application may be doing too much work on its main thread.
Android Studio, Flutter, and Node were all installed three days ago. I've run flutter doctor both with the -v flag and without it. Some licenses weren't accepted at first, but that has since been sorted out.
Here is the code I pasted from the previous page:
import 'dart:async';
import 'dart:math';

import 'package:flutter/material.dart';
import 'package:speech_to_text/speech_recognition_error.dart';
import 'package:speech_to_text/speech_recognition_result.dart';
import 'package:speech_to_text/speech_to_text.dart';

void main() => runApp(MyApp());

class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  bool _hasSpeech = false;
  double level = 0.0;
  double minSoundLevel = 50000;
  double maxSoundLevel = -50000;
  String lastWords = "";
  String lastError = "";
  String lastStatus = "";
  String _currentLocaleId = "";
  List<LocaleName> _localeNames = [];
  final SpeechToText speech = SpeechToText();

  @override
  void initState() {
    super.initState();
  }

  Future<void> initSpeechState() async {
    bool hasSpeech = await speech.initialize(
        onError: errorListener, onStatus: statusListener);
    if (hasSpeech) {
      _localeNames = await speech.locales();
      var systemLocale = await speech.systemLocale();
      _currentLocaleId = systemLocale.localeId;
    }
    if (!mounted) return;
    setState(() {
      _hasSpeech = hasSpeech;
    });
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: const Text('Speech to Text Example'),
        ),
        body: Column(children: [
          Center(
            child: Text(
              'Speech recognition available',
              style: TextStyle(fontSize: 22.0),
            ),
          ),
          Container(
            child: Column(
              children: <Widget>[
                Row(
                  mainAxisAlignment: MainAxisAlignment.spaceAround,
                  children: <Widget>[
                    FlatButton(
                      child: Text('Initialize'),
                      onPressed: _hasSpeech ? null : initSpeechState,
                    ),
                  ],
                ),
                Row(
                  mainAxisAlignment: MainAxisAlignment.spaceAround,
                  children: <Widget>[
                    FlatButton(
                      child: Text('Start'),
                      onPressed: !_hasSpeech || speech.isListening
                          ? null
                          : startListening,
                    ),
                    FlatButton(
                      child: Text('Stop'),
                      onPressed: speech.isListening ? stopListening : null,
                    ),
                    FlatButton(
                      child: Text('Cancel'),
                      onPressed: speech.isListening ? cancelListening : null,
                    ),
                  ],
                ),
                Row(
                  mainAxisAlignment: MainAxisAlignment.spaceAround,
                  children: <Widget>[
                    DropdownButton(
                      onChanged: (selectedVal) => _switchLang(selectedVal),
                      value: _currentLocaleId,
                      items: _localeNames
                          .map(
                            (localeName) => DropdownMenuItem(
                              value: localeName.localeId,
                              child: Text(localeName.name),
                            ),
                          )
                          .toList(),
                    ),
                  ],
                )
              ],
            ),
          ),
          Expanded(
            flex: 4,
            child: Column(
              children: <Widget>[
                Center(
                  child: Text(
                    'Recognized Words',
                    style: TextStyle(fontSize: 22.0),
                  ),
                ),
                Expanded(
                  child: Stack(
                    children: <Widget>[
                      Container(
                        color: Theme.of(context).selectedRowColor,
                        child: Center(
                          child: Text(
                            lastWords,
                            textAlign: TextAlign.center,
                          ),
                        ),
                      ),
                      Positioned.fill(
                        bottom: 10,
                        child: Align(
                          alignment: Alignment.bottomCenter,
                          child: Container(
                            width: 40,
                            height: 40,
                            alignment: Alignment.center,
                            decoration: BoxDecoration(
                              boxShadow: [
                                BoxShadow(
                                    blurRadius: .26,
                                    spreadRadius: level * 1.5,
                                    color: Colors.black.withOpacity(.05))
                              ],
                              color: Colors.white,
                              borderRadius:
                                  BorderRadius.all(Radius.circular(50)),
                            ),
                            child: IconButton(
                              icon: Icon(Icons.mic),
                              onPressed: () {
                                /*...*/
                              },
                            ),
                          ),
                        ),
                      ),
                    ],
                  ),
                ),
              ],
            ),
          ),
          Expanded(
            flex: 1,
            child: Column(
              children: <Widget>[
                Center(
                  child: Text(
                    'Error Status',
                    style: TextStyle(fontSize: 22.0),
                  ),
                ),
                Center(
                  child: Text(lastError),
                ),
              ],
            ),
          ),
          Container(
            padding: EdgeInsets.symmetric(vertical: 20),
            color: Theme.of(context).backgroundColor,
            child: Center(
              child: speech.isListening
                  ? Text(
                      "I'm listening...",
                      style: TextStyle(fontWeight: FontWeight.bold),
                    )
                  : Text(
                      'Not listening',
                      style: TextStyle(fontWeight: FontWeight.bold),
                    ),
            ),
          ),
        ]),
      ),
    );
  }

  void startListening() {
    lastWords = "";
    lastError = "";
    speech.listen(
        onResult: resultListener,
        listenFor: Duration(seconds: 10),
        localeId: _currentLocaleId,
        onSoundLevelChange: soundLevelListener,
        cancelOnError: true,
        partialResults: true);
    setState(() {});
  }

  void stopListening() {
    speech.stop();
    setState(() {
      level = 0.0;
    });
  }

  void cancelListening() {
    speech.cancel();
    setState(() {
      level = 0.0;
    });
  }

  void resultListener(SpeechRecognitionResult result) {
    setState(() {
      lastWords = "${result.recognizedWords} - ${result.finalResult}";
    });
  }

  void soundLevelListener(double level) {
    minSoundLevel = min(minSoundLevel, level);
    maxSoundLevel = max(maxSoundLevel, level);
    //print("sound level $level: $minSoundLevel - $maxSoundLevel ");
    setState(() {
      this.level = level;
    });
  }

  void errorListener(SpeechRecognitionError error) {
    print("Received error status: $error, listening: ${speech.isListening}");
    setState(() {
      lastError = "${error.errorMsg} - ${error.permanent}";
    });
  }

  void statusListener(String status) {
    print(
        "Received listener status: $status, listening: ${speech.isListening}");
    setState(() {
      lastStatus = "$status";
    });
  }

  _switchLang(selectedVal) {
    setState(() {
      _currentLocaleId = selectedVal;
    });
    print(selectedVal);
  }
}
I imported the package by adding the dependency to the yaml file and importing it via pub, and I've run out of ideas. Please help. Thank you.
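For reference, the dependency entry in my pubspec.yaml looks roughly like this (the version constraint shown here is only an example, not necessarily the exact one I have):

```yaml
dependencies:
  flutter:
    sdk: flutter
  # speech recognition plugin from pub.dev; version constraint is illustrative
  speech_to_text: ^2.0.1
```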
Apps that use the speech_to_text plugin need user permission. On iOS you need to add the required keys to the Info.plist file; on Android you need to add the record-audio permission to AndroidManifest.xml.
Here are the package instructions: https://pub.dev/packages/speech_to_text — scroll down to "Permissions". There you can see how to add everything mentioned above.
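As a sketch (check the package page linked above for the authoritative list), the Android side needs the microphone permission declared in android/app/src/main/AndroidManifest.xml:

```xml
<!-- Required so the plugin can record audio from the microphone -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```

and on iOS the usage-description keys go in ios/Runner/Info.plist (the string values are placeholders; write your own user-facing explanations):

```xml
<!-- Shown to the user when the app first requests speech recognition and microphone access -->
<key>NSSpeechRecognitionUsageDescription</key>
<string>This app uses speech recognition to convert your speech to text.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app needs microphone access to listen for speech.</string>
```

After adding these, do a full stop and rebuild of the app (not just a hot reload) so the new manifest and plist entries take effect.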