I’m using an Intel Edison kit, and the code for it is written in C++ using the Intel System Studio IoT Edition. I use mraa library functions (libmraa) to transmit data over the UART.
Everything works as expected when I debug in the Nextion Editor environment.
However, while working with the Nextion Enhanced NX4827K043R, I noticed some unexpected events which I could not handle.
The problem is this:
After sending each command to the Nextion module, I read back a large number of 0xFF bytes from the buffer.
I use the following parameters:
With these parameters, it would be natural to assume that I will not receive any reply generated by the Nextion module after sending something to it. Nevertheless, a large number of 0xFF bytes is returned (and the amount differs for every command).
A 17-byte command (including the 0xFF 0xFF 0xFF terminator) returns 16 0xFF bytes.
A 47-byte command returns 42 0xFF bytes.
A 35-byte command returns 31 0xFF bytes.
How can I get rid of this redundant data which is sent back after every transmitted command?
I think you will enjoy it very much.
I’m now looking at coding my timing methods differently, with an accurate visual reading of the output.
It has increased my productivity: less guessing and time chasing ghosts, less trial and error.
So many years without one, but a nice addition to any tool set.
Knowing what I know now, it would be nice to give my 1990s self a heads-up about them.
After reading through your opinion from left to right and vice versa, then flipping it upside down to get a fresh view, and after having applied all of the deductive methods at my disposal, I have made the decision that Logic Pro 8 is the way to go :-)
Thanks for the insight!
A bigger, better tool is always nice to have, and may save you from a repurchase down the road for future needs.
I say may, because as future capabilities become even more capable, the future may also outgrow the current upper end.
It really depends on what you're analyzing and what budget you want to put into your tools.
And analyzing a protocol is just the highs and lows, 0s and 1s.
Then there are oscilloscopes to show voltages and the minor variations that could be interfering.
There are enough tools out there to exhaust any reasonable budget =)
So it's a balance of what we need now to get us to the next steps.
Well, I'd narrow down the choice to Saleae Logic 8 and Logic Pro 8.
Which one of these two would you say is ideal for an average Intel Edison for Arduino + Nextion display project setup? Should I go for 100 MS/s and 25 MHz, or is 500 MS/s and 100 MHz a safer option in case I need anything extra for future jobs?
You want to select a tool that fits your needs:
- How deep are you getting into the timings of what you're sampling?
My searching around found a few open-source and other logic analyzers that will provide the high/low graphing, but the main difference between these and Saleae's (and higher-end units) is the software. One example is the selection of protocol analyzers: I2C, Serial, SPI, etc. That makes a huge difference in "calculating" high/low timings into bytes and displaying them on screen for you. So the software is a time saver.
Sample speed is also very important to your selection. You need enough sampling rate for what you are analyzing: the higher the speed of the protocol, the more MS/s you'd want. You don't want to sample, say, 10 MHz comms at a 10 MS/s sample rate; you'll want room for oversampling. Of course, the higher the speed, the more system resources are needed. It's a balance of want and practical, and steeper prices will keep your wants in check =).
How many channels to get depends on what you're working on. I think it is less common to really use 8 at the same time, let alone 16. If your job has you using 16 or more, you'll have specialized equipment just so you don't have a tangle of leads.
Everyday tasks: it depends what your everyday tasks are. Logic analyzers are a help when you have an issue, say in timing, or to confirm things for peace of mind. You might break it out, hook it up, keep it close by, check a few things; but it's unplugged and packed up when you find your issue. If your use is more constant than that, you might want a more permanent setup involving pogo pins.
I think Saleae hits a good price point, and I think their software is good. It certainly doesn't take much effort to connect, make a couple of configuration choices, run a sample, and have an answer. How many channels you need depends on how complex your projects are.
Could you help me choose a USB logic analyzer to buy?
As I understand it, you use Saleae Logic. Which of their models would be sufficient for everyday tasks?
I am sorry. I can see we are drifting a bit away from your issue.
How is setting the time interval between bits and between bytes not baud?
Okay, perhaps technically you could do so between bytes, but why? A serial byte is defined by its bits, with an expected min/max bit width, framed by the start bit and finalized by the stop bit. The total width of these 10 bits has to fall within the baud rate timing, with a very small allowance of deviation allotted to compensate for timer inaccuracy. (Corrected the earlier error of parity leading; N81 was my temporary insanity.)
Nextion sends out its reply when processing has completed. As this is interpreted by firmware, this can range from near immediate under no load, to near 3 ms, to several ms on an overloaded serial command queue. It is very much dependent on what the command requested (output a system variable: little time until output; stuff 1K of EEPROM: much more time; trigger a click for a custom screen painted pixel by pixel: ~180,000 ms). But it is certainly sent at the end of processing.
Adjusting the timings of the transmission will not have much effect unless you're exploring bit-banging. The timings of the Nextion USART hardware module are stated in the STM32/GD32 specifications (which chip is used as the Nextion-side MCU is model dependent). But for 115200 baud, the bit width needs to be a constant 1/115200th of a second (or as near constant as the mismatched clocks will allow). This timing needs to be constant across the whole of the bit width, as the line is generally oversampled and determined to be high or low by the average (compensation for noise on the line). Most bit-banging implementations do not spend the time to compute the average over every sample, but select three or four (say the 6th, 9th, and 13th samples of a 16x oversampling). The Edison, as you can imagine, has more than enough compute power to handle this, as you see with 3.68 Mbps; more than enough to handle Nextion's top baud rate of 115200.
Now, as seen in the Edison Module electrical characteristics, this is high on idle (TTL) and matches Nextion's TTL high-on-idle nature. But once again, this is for something that is NOT the Arduino board for Edison; which is why I asked if there were any hardware chips identified on the Arduino board, as they could indeed interfere with this nature provided by the Edison Module when such hardware sits between the Edison module connector pins and the final pins you are using for your D0/D1.
When System Studio IoT describes the same as rs232 (which would be characteristic of low on idle), this is either the difference of a change in configurable pins, or I have to presume there is something between the Edison Module and your D0/D1 pins. It could possibly just be a documentation error; but between this and the stream of 0xFFs you are experiencing, something is causing it.
Patrick, I can also set the time interval between bits and between bytes during reading and writing.
What intervals does Nextion use?
The info I sent you came from the Intel shield testing report for Galileo 1, 2 and Edison.
The file attached contains the UART information from the Intel Edison Compute Module Hardware Guide.
Note there is a difference between Arduino's SoftwareSerial library being
"supported" and bit-banging out a software serial using two digital pins in code.
Arduino's SoftwareSerial implementation is just too specific to be used here.
Use of named Serial and Serial1 objects is mostly for Arduino familiarity.
Referencing Galileo doesn't have any bearing on an Edison
- Galileo is a Quark (80486-class),
- Edison is a dual-core Atom with an internal Quark as MCU
- the Quark MCU is not accessible in this manner
> The primary serial port on Galileo 1, Galileo 2, and Edison is a single wire,
I do not see how this can be correct: first, as shown by D0/D1; second,
Intel implemented the UART as RX/TX/CTS/RTS (albeit bit-banged)
If you have an Edison, then only Edison info is relevant.
- the platforms are indeed different enough to have specific requirements
But I was referring to physically identifying if there was any hardware chip
that would act as a Hardware UART on the Arduino Board itself, as the
Edison Module itself certainly does not have any such on the chip.
- This step of doing so will most likely reveal whether the pins can be configured
or if such a hardware chip locks this into place "as is where is"
Galileo serial ports
Software serial ports are not supported on Galileo 1, Galileo 2, and Edison boards. It is not expected to be supported in the future. The Intel boards do support Hardware Serial that is referenced simply as serial ports. The serial ports are typically used from the IDE for debugging; therefore, it can be confusing if a shield or breakout board utilizes the serial port through digital pins.
The primary serial port on Galileo 1, Galileo 2, and Edison is a single wire, however, in the sketch they are in fact two serial ports that are automatically multiplexed when a sketch is downloading. However, in the sketch itself, the multiplex serial port is referenced as two objects called ‘Serial’ and ‘Serial1’. The ‘Serial’ object is the serial port on the IDE. ‘Serial1’ is the serial port connected to D0/D1 for transmit and receive of data on those digital pins.
For Galileo 1, ‘Serial2’ is the audio connector typically used for a login shell. To use it in a sketch may require some remapping since it is tied to a login process (this has not been investigated).
For Galileo 2, ‘Serial2’ is mapped into D2/D3 for transmit and receive. The ‘Serial2’ object can be used as a second serial port accessible through a sketch. In G2, there exist another serial port next to the Ethernet port that can be used as the serial console (previously used by G1 and the audio connector).
For Edison, additional serial ports have not been investigated.
Table 4: Sketch serial ports (by board)
So the behaviour is affected by the bkcmd setting
- bkcmd=0 has no accompanying fail/pass responses, least chatty indeed
- bkcmd=3 is most chatty with fail/pass responses
(2 is only on fail, 1 is only on pass)
For clarification, there was no assertion that the 0xFF count was divisible by 3
- if using the Nextion methodology, termination is 3 bytes of 0xFF
- when reading the stream from an unknown point mid-byte, there is no way to count from an unknown start
Since you "exclusively" use print/printh
- there will be minimal terminations sent by Nextion (but not zero).
We have eliminated the Nextion side through behaviour and the Logic line output.
The stream is coming from misinterpretation (it simply is not on the wire)
The phantom 0xFFs are on the Edison side of the RX/TX pair (but still not on the wire)
So now we look at your Edison side
This may be beyond Nextion scope, but perhaps still a discussion
I find conflicting definitions of the Edison UART
1) On the Edison Module alone, there is no UART hardware;
this is indeed a bit-banged, muxed-IO implementation for the SoC
2) The Arduino of Things version is described as TTL (high-on-idle nature)
(Arduino of Things is beyond Intel support scope, but appears compatible)
(Arduino board for Edison is indeed Arduino certified)
3) System Studio IoT describes it as rs232 (low-on-idle nature)
These definitions do not necessarily conflict with each other: as the Module
does not actually have a hardware module for the UART, this is either
completely handled by software OR your Arduino board has something extra.
To me, this seems to be "settable". Certainly Arduino of Things claims an
implementation of TTL, IoT claims rs232. Perhaps it is settable.
But your libmraa does attempt to be a wrapper for the underlying GPIO control,
and as such you state there is no (or you have yet to find an) exposed setting for TTL/rs232.
Examine your board to determine if ANY hardware UART even exists.
But indeed, if IoT was expecting rs232 low-on-idle while Nextion is TTL high-on-idle,
then there is a real probability that bit-banged serial reads could interpret idle as 0xFF,
somewhere ignoring the stop bit/parity or misinterpreting line noise.
So a few paths may exist
- rs232 to TTL adapter? (unknown, just throwing it out there)
- bit-bang software serial on two digital pins you can configure
- hack your libmraa
- bypass your libmraa
- trial of Arduino of Things (my bet is this may be less flawed)
- break out your oscilloscope (I showed the Logic Analyzer already)
Patrick, at first I used bkcmd=2, then =0 and have now switched back to using =2. Nothing changes.
Also, the amount of these dubious 0xFFs is not divisible by 3.
That was just a description of the parameters I use; I didn't literally code it that way (I mentioned this in a follow-up reply).
Andrew, I see you stated in the opening notes that you set the following...
"I use the following parameters:
I've been puzzled as to why you experienced this odd data flow.
However, you didn't say where you included these settings.
The baud setting should be bauds=9600
Also there should be no semi-colons terminating these items.
I can't get an HMI to compile with such errors, so how did you make these settings?
Nextion Instruction Set :: Nextion Return Data.
That was my assumption about the terminators going out from Nextion.
This is indeed the Nextion way, providing some structure:
Touch events 0x65, page changes 0x66, coordinates 0x67/0x68,
string 0x70, numbers 0x71, Nextion Ready 0x88, etc.
But certainly print/printh can indeed be used
So what is your Nextion's bkcmd set at?
This can indeed be a source of many terminated replies